Reading the tea leaves at WWDC this year, there were a few comments that pointed to Apple really not wanting AMD back. Craig Federighi even had an aside on The Talk Show (about 57 minutes in) that implied the only reason developers were considering Mac games again was that they no longer had to deal with third-party GPU drivers or hardware.
"The only reason" is a bit of a stretch.
Federighi did explicitly mention in that segment that some game engine providers are porting over to the iPhone/iPad platform. If iOS gaming hadn't been a large revenue generator over the last 3-4 years, I'm skeptical that Apple would be pushing this gaming angle as hard on the Mac this year (at least in this unidimensional GPU-hardware fashion). Also, to a lesser extent, if they didn't have a huge sunk cost dumped into the AR/VR headset, there would probably be a bit less focus. (Some custom GPU software/hardware work will likely eventually leak back over to the Mac side from there, and vice versa.)
There is real revenue uplift potential here if Apple can 'bleed' some of that gaming revenue pool over to the macOS portion of the ecosystem. That doesn't mean a very high fraction of that revenue has to come from the most "big budget" gaming titles that are exceptionally hard on GPU resources.
So when he says something to the effect of "all (Apple Silicon) Macs have awesome graphics, capable of playing the best games", the super hardcore gamers are going to balk at that (cranking things down to 1080p and not having every whiz-bang setting at maximum). But gaming consoles do decent revenue without having the most fire-breathing system set-ups. I suspect Apple would be happy enough pulling in some fraction of that kind of revenue on the Mac. They are not out to be the ultimate gaming hardware option; just a good one that is competitive with consoles (without having to match them on price).
As long as the "GPU consistency" line-up has to span all the way down to the iPhones, that is implicitly the bigger driver pushing the third-party GPUs out. Apple shifting from licensing the GPU (Imagination Tech) to building their own is partially driving the 'scope creep' here. For Macs overall, the biggest GPU vendor by units was Intel over the last 10 years (or so). What the Mac Pro could do with cards "sucked in" from the PC gaming market really wasn't the point. It is the bulk of the Mac product range, where thermal and space constraints were far more pressing, that mattered. The primary point was to kill off not just iGPUs but also dGPUs in the laptop space. 70+% of all Macs sold are laptops, so that sets the whole tone for the graphics stack for the platform. (Once you throw on top all the desktops that use laptop SoCs (Mini, iMac 24", entry Studio, etc.), there is relatively little left that isn't Apple-GPU based.)
Start tossing the plain Mx SoC into the mid-to-upper range iPads and the 'black hole' in graphics policy gets even bigger. [Indeed, at the WWDC session on what is new in the "drivers" space, the primary headline grabber was that the modern DriverKit API is portable to the iPads ("Mac" drivers on iPad). One can see more clearly why there is no "Graphics" subclass in the DriverKit API space: if portability to iPad is a core requirement, then there are not going to be eGPUs or PCIe slots there. The legacy IOKit had a "graphics" object class; that is not being mapped over.]
The M1 -> M2 transition gives a higher transistor-budget allocation to the GPU. That lowers the pressure to provision eGPUs for the new laptops (and plain Mx desktops) even more. When the plain Mx goes to TSMC N3... rinse and repeat. Apple is also playing the "long game" here. It isn't quite where the M-series is now, but where it is going to be in 2-3 years, that is the primary focus for the Mac product line-up.
So WWDC 2022 had lots of stuff on how to optimize for the Apple GPU (just like you would see at an Xbox or PlayStation developer conference). Lots of "do more with the hardware you've got" rather than "port to the latest fire-breathing dragon hardware" (e.g., an Nvidia developer conference).
P.S. Where Apple does still appear to be painting themselves into a corner, though, is on the GPGPU front. For graphics, you can shave off the upper 10% of the hardware range and survive (e.g., Xbox/PlayStation do). For more general GPGPU compute, where the compute and the results display don't necessarily have to live on the same card, there is a big gap, especially once you get into the high-end single-user workstation space.
Apple is still playing "catch up" on PyTorch and AI/ML training. A compute accelerator doesn't have to have drivers that live down in the "Apple only" kernel space.
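As a concrete illustration of where that PyTorch catch-up stands: as of PyTorch 1.12 (shipped around this same timeframe), training can target the Apple GPU through the Metal Performance Shaders (MPS) backend, but code still has to probe for it and fall back to CPU. A minimal sketch (the `pick_device` helper is my own naming, not a PyTorch API):

```python
# Minimal device-selection sketch for PyTorch's MPS backend.
# `pick_device` is a hypothetical helper, not part of PyTorch itself.
def pick_device() -> str:
    try:
        import torch
        # torch.backends.mps.is_available() was added in PyTorch 1.12;
        # it reports whether the Metal (Apple GPU) backend can be used.
        if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
            return "mps"
    except ImportError:
        pass  # torch not installed; fall back to CPU
    return "cpu"

print(pick_device())  # "mps" on Apple Silicon with PyTorch >= 1.12, else "cpu"
```

Models and tensors then get moved with `.to(pick_device())`; the operators that aren't yet implemented for the MPS backend are exactly where the "catch up" shows.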