Sorry, friend, I don't buy this theory unless Apple is controlling AMD.
So unless there are some signs of Apple holding a controlling interest in a company it depends on, I doubt Apple would grant indirect control over some of its products to an outside party.
I wouldn't even be surprised if Apple switched the Mac Pro's GPU to Nvidia at the last moment, just to keep the door open with an alternative supplier.
This is not 2013, when AMD had a very good GPU (the D700) and Nvidia a gray record. Now Nvidia shines while AMD is late at best, and its Polaris GPUs fall well below Nvidia in performance per watt.
An iMac on an AMD APU would be good for stopping Intel from dictating CPU prices, but only as long as Apple doesn't get handcuffed to AMD or Intel, so as not to repeat the PowerPC fiasco it had with IBM, this time with AMD.
That is your opinion and I respect it, even though reality looks different from your view (about the Apple-Nvidia relationship and the potential of Nvidia hardware in any of the Macs).
However, I will give you one data point on efficiency.
Yesterday I had the opportunity to test two GPUs for my home build and 1080p gaming. The system: Intel i7-6700T, 16 GB Kingston HyperX Fury 2400 MHz, Crucial BX300 525 GB, MSI Z170 Mini-ITX motherboard, Be Quiet! Silent Loop 280 on the CPU, a Fractal Design Nano-S case, and the two GPUs: a Gigabyte GTX 1050 Ti Gaming X 4 GB and a single-fan XFX Radeon RX 470 4 GB. Why these two? Because the price difference between them is just 100 PLN in my country.
Then my friend, with whom I was playing on those builds, and I tested two games that we actually play.
Resolution 1080p, everything maxed out with 4x antialiasing. The games were Heroes of the Storm and Overwatch.
One thing we were curious about: how efficiency would be affected by V-Sync.
We fired up HotS. Power consumption for the build with the GTX 1050 Ti was around 110-120 W at the wall. How did the RX 470 fare here? Around 120-125 W. HotS is not a very demanding game and is better optimized for Nvidia architectures, so here there is a slight advantage for the GTX 1050 Ti.
How did Overwatch fare? This is a very demanding game: even a GTX 980 Ti can't max it out at 4K/Epic settings (it averages 40-45 FPS).
Well, the whole-system power consumption at the wall was around 140-145 W for the GTX 1050 Ti.
And the RX 470? Also 140-145 W.
I genuinely have a very hard choice right now between these two GPUs. Maximum whole-system power consumption observed: GTX 1050 Ti 156 W, RX 470 4 GB 218 W.
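For a rough sense of how the 100 PLN price gap weighs against the 62 W worst-case power gap, here is a quick back-of-the-envelope sketch. The electricity price (~0.60 PLN/kWh) and the assumption that the hungrier card is the cheaper one are mine, not from the measurements, so plug in your own tariff:

```python
# Back-of-the-envelope break-even: how many hours of worst-case gaming
# before the extra power draw eats the price gap.
# Assumptions (mine, not measured): the 218 W card is 100 PLN cheaper,
# and electricity costs about 0.60 PLN/kWh.

price_gap_pln = 100              # price difference between the two cards
delta_w = 218 - 156              # observed worst-case whole-system gap: 62 W
price_per_kwh = 0.60             # assumed tariff, adjust to your bill

extra_cost_per_hour = (delta_w / 1000) * price_per_kwh   # PLN per gaming hour
hours_to_break_even = price_gap_pln / extra_cost_per_hour
print(f"break-even after about {hours_to_break_even:.0f} hours of gaming")
```

Under those assumptions it takes on the order of 2,700 hours of worst-case load before the power difference cancels the price difference, so for a 1080p home build the purchase price and the games you play probably matter more than the wattage gap.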
I like the GTX 1050 Ti very much. But...
What would you do in my place, guys? And sorry for going slightly off-topic.