Apple Silicon still doesn't have AMD GPU drivers. That's not a positive sign either.
AMD drivers are never coming to Apple Silicon
It is a bit too early to call "never". I would definitely say "not soon" at this point; the snooze button is probably set to macOS 13. Two major things are still in play here:
1. Apple is going to herd ("force") more developers into spending much more time and effort optimizing for the Apple GPU. It is also less work for Apple as they try to mature their stack; they haven't ever done a driver for a 128-core GPU either. Apple isn't finished. A substantial number of developers are still coasting on Rosetta (or haven't even thought about getting started).
2. Looking at what has come out of WWDC 2021 so far, I have a suspicion that Apple could be waiting on a future AMD (and/or maybe Intel) GPU. If you look at the texture compression tools that Apple just released, the Intel/AMD compression is BCn tech while the Apple GPU does ASTC (the Apple GPU does PVRTC (PowerVR) too, but nobody should be using that for new code; it is deprecated). For iOS/iPadOS the only option is ASTC. In the mobile GPU world ASTC is common; the biggest holdouts are AMD and Nvidia (predominantly desktop add-in-card players). Intel has it in some of their most recent iGPUs, and AMD/Samsung are likely working on it for their mobile variant that should release in the next year or so. (A minimal sketch of probing for this split is right below.)
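To make that BCn/ASTC split concrete, here is a small Swift/Metal sketch. The helper name is my own invention, not an Apple API, but the family checks and pixel formats are the real Metal ones a renderer would query today:

```swift
import Metal

// Hypothetical helper (my name, not an Apple API): pick a compressed
// texture format based on which GPU family the device reports.
func preferredCompressedFormat(for device: MTLDevice) -> MTLPixelFormat {
    if device.supportsFamily(.apple2) {
        // Apple GPUs (A-series, M-series) support ASTC natively.
        return .astc_4x4_ldr
    } else if device.supportsFamily(.mac2) {
        // Intel/AMD GPUs in Macs take the BCn path (BC7 here).
        return .bc7_rgbaUnorm
    }
    // Neither family matched: fall back to uncompressed RGBA8.
    return .rgba8Unorm
}

if let device = MTLCreateSystemDefaultDevice() {
    print("\(device.name) -> \(preferredCompressedFormat(for: device))")
}
```

On an M1 Mac the first branch fires; on an Intel Mac with an AMD dGPU the second one does, which is exactly the fork in the road Apple's compression tools are exposing.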
Also, AMD has "Smart Access Memory", but that is pretty far from the shared-internal-memory-bus version of Unified Memory that Apple works with. Something like CXL over PCIe 5 would have fewer limitations and get closer to a non-homogeneous but more unified memory. [ AMD is doing something similar with CDNA2 and their next-gen EPYC CPUs, linking them with an updated Infinity Fabric so that CPU and GPU are on unified memory for HPC workloads. ]
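For what the difference means to a developer right now, here is a rough Swift/Metal sketch (my illustration; the function name is made up) of the two memory models as Metal exposes them:

```swift
import Metal

// Rough sketch of the unified-vs-discrete memory split in Metal.
// On Apple Silicon, hasUnifiedMemory is true and one .storageModeShared
// allocation is visible to CPU and GPU alike. On a discrete Intel/AMD
// GPU, .storageModeManaged keeps two copies, and CPU-side writes must
// be explicitly mirrored across PCIe.
func makeBuffer(device: MTLDevice, values: [Float]) -> MTLBuffer? {
    let length = values.count * MemoryLayout<Float>.stride
    if device.hasUnifiedMemory {
        // Single physical allocation; no copy, no sync step.
        return device.makeBuffer(bytes: values, length: length,
                                 options: .storageModeShared)
    }
    guard let buffer = device.makeBuffer(bytes: values, length: length,
                                         options: .storageModeManaged) else {
        return nil
    }
    // Example CPU-side update: overwrite the first float, then tell
    // Metal which byte range to push over to the GPU's copy.
    buffer.contents().storeBytes(of: Float(42), as: Float.self)
    buffer.didModifyRange(0..<MemoryLayout<Float>.stride)
    return buffer
}
```

That explicit didModifyRange step is the tax a split-memory design pays, and it is the kind of thing a CXL- or Infinity-Fabric-style unified bus would make unnecessary.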
Doing a GPU driver for Apple used to come with a modest list of demands/requirements. I think they have shifted that to an even taller stack for an even smaller potential reward. If an Apple GPU is in every Mac sold, there is no "embedded design" contract for a 3rd party to win; 3rd parties are left solely with add-in cards via internal slots and eGPU boxes.
For a while IBM and Nvidia worked on better memory bus transfer integration between POWER CPUs and Nvidia GPUs via NVLink. When both partners want to do something, it can happen.
For AMD it may be like the Lando-Vader exchange in "The Empire Strikes Back":
Lando: That was never part of our deal.
Vader: Perhaps you feel you are being treated unfairly.
Lando: This deal is getting worse all the time.
That would be drifting into the "never" category. It wouldn't necessarily be Apple unilaterally closing the door; it could be AMD walking away too. [ AMD isn't the AMD of 6-7 years ago looking for contracts to keep the lights on. They are making money and buying Xilinx for billions. Begging Apple for scraps isn't necessary. ]
Ironically, Intel might be more motivated to get any kind of contract, and so more willing to put up with the "bad deal" just to move more GPU units. However, with the crypto-driven shortage they can probably sell about every GPU they can make if they keep prices close to MSRP, even if the performance isn't the most bleeding edge. If Intel isn't too greedy, they can get back into the GPU game without Apple.
[ Nvidia was already in the extremely likely "never" category even before the Apple Silicon transition started. ]
If Apple gets "stuck" at 128 GPU cores, they will need some outlet for a less-unified-memory way to add in more GPGPU cores. If they need process shrinks on the SoC and RAM to move forward, then they'll probably need some partners. I don't think Apple is going to have a good feel for just how far they may have painted themselves into a corner until they have deployed to lots of customers with a large variety of real-world workloads.
The Intel Mac Pro's MPX-modules-with-AMD-RDNA2 setup is the backstop if they have.
If Intel's GPU becomes competitive, then it is a credible 3-way race between Nvidia, AMD, and Intel to improve top-end GPUs. Apple out-hustling all three of them is dubious; maybe folks at Apple are drinking gobs of Cupertino Kool-Aid, but that just isn't a good long-term bet to make. Removing dGPUs from the whole line of standard configurations would be a huge win for Apple. There is a point, though, where they get too greedy and try to ban dGPUs altogether. At that point Apple would be stepping on customers who just want to get more work done in a timely fashion, and it would probably generate blowback.