The fact that AMD's GPU fits in a (silly) 35W TDP budget does not make their GPU more efficient than NVIDIA's. It just means they produced a GPU with a 35W TDP, no more, no less. Efficiency is performance per watt, and NVIDIA has been miles ahead of AMD in this area for the last few generations. A simple example is the 120W GTX 1060 competing with, and usually beating, the 150W RX 480.
If Apple went with NVIDIA this round, you would see some kind of 35W NVIDIA GPU (probably a GP107 variant), and it would still be more efficient than an AMD GPU with the same power budget.
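For the number-crunchers, here is a minimal perf-per-watt sketch in Python. The TDPs are the ones cited above; the relative performance figures are hypothetical placeholders (not measurements), just to show how the arithmetic works:

```python
# Perf-per-watt sketch: efficiency = relative performance / TDP.
# The relative_perf values are HYPOTHETICAL placeholders, not measured
# results -- only the TDPs come from the post above.
def efficiency(relative_perf, tdp_watts):
    return relative_perf / tdp_watts

gtx_1060 = efficiency(relative_perf=1.00, tdp_watts=120)
rx_480   = efficiency(relative_perf=0.95, tdp_watts=150)  # hypothetical

print(f"GTX 1060: {gtx_1060:.4f} perf/W")
print(f"RX 480:   {rx_480:.4f} perf/W")
print(f"Advantage: {gtx_1060 / rx_480:.2f}x")  # ~1.32x under these assumptions
```

Even if you assume the two cards perform identically, the 150W vs 120W TDP gap alone hands the 1060 a 1.25x efficiency lead before any performance difference is counted.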
I agree, the argument for AMD is a confusing one. Coming from film and video, I for one have seen all the differences between GPUs when they are taken to extremes; we push whatever GPUs we have to their absolute limits. There was a time when AMD and NVIDIA were roughly neck and neck, but the small edge NVIDIA had over AMD has grown too large as the manufacturing process has gotten smaller and newer, better designs have been implemented. Sadly, it's not a competition anymore, and in all of our tests AMD doesn't make sense, even at a cheaper price.
First, we tested the MacBook Pro with the Radeon Pro 460 (35W), and it was only a bit faster at times than the GTX 960M (50W) and never faster than the 965M (20W-50W). It had bursts that got almost to 965M levels, but I assume that is thanks to the 4GB of VRAM on the GPU. Before the chip came out, AMD believers were saying it would SMOKE all laptop GPUs, and it absolutely has not. It's pretty MEH!
Secondly, that is just the hardware layer, where it was MEH! Now we jump to the software layer, and we are totally screwed. OpenCL was, and is, a great idea, but it has never been fully used or optimized by the software that is supposed to take advantage of it. Everyone wants their own framework: Metal for Apple, and Vulkan/MoltenVK for AMD. We get screwed by both for different reasons.
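To make the fragmentation point concrete, here is roughly what a trivial OpenCL compute job looks like, sketched with the pyopencl bindings (assuming pyopencl and a working OpenCL driver are installed). An app that wants Metal or Vulkan instead has to re-port both the host code and the kernel to each framework:

```python
# Minimal OpenCL vector-add via pyopencl -- a sketch of the kind of compute
# work apps must re-port to each vendor framework (Metal, Vulkan, CUDA).
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()       # pick any available OpenCL device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel itself is OpenCL C; a Metal port means rewriting this in MSL.
prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```

Multiply this boilerplate by every kernel in a Resolve- or Premiere-class app, and you can see why vendors drag their feet on supporting yet another framework.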
Metal and Vulkan, dare I say it, are going to be screwed, because Apple doesn't care about the high end. The software layer will take advantage of and harness a GPU's power, but your graphics layer is only as good as your GPU, and if you have a MEH! GPU, you have a MEH! graphics layer. And this is just Apple and Apple products.

Again, Apple has designed themselves into a corner. Apple has stripped all their software down to the bare bones and gotten rid of everything that competed at the high end. Final Cut Pro is now kinda iMovie-ish. Aperture is gone, replaced by Photos, which has no pro photo features. I could see Apple and a few other companies writing clean, nicely accelerated programs under Metal, but the underlying GPU won't be that great, so it's kinda lame.

There are no developers writing the software of today who are staying committed to Apple; Apple has just made it so hard, by designing hardware in a bubble and then having all the software vendors scramble after the hardware is released. Also, how long does macOS have? Anyone who uses high-end programs will have to go to PC at some point; as Apple strips down their hardware even further, people will have to leave macOS to take advantage of current hardware. Also, Vulkan seems more geared towards games; there might be some OS X/PC games released simultaneously, but they probably won't be the super mainstream titles.
If we build workflows using NVIDIA GPUs, they are amazingly fast. Our GTX 980 Tis absolutely scream with CUDA in Blackmagic DaVinci Resolve and Adobe CC.
ANYWAY, this all just leaves people in my industry confused about where Apple is going and why they are making the decisions they are. And the only answer is CHEAP CHEAP hardware with high markups: make as much $$ as you can as you phase out the high end, towers, laptops, and whatever else. Force all macOS users into something like an iMac and something like an ultrabook, and try to convince everyone that's all you need to do anything in this world. Then make sure they buy a new iPhone every few years, and that's it... oh, and sell them gimmicky gadgets.
I am glad that you are happy with it. I am happy with my Zotac GTX 1050 Ti OC Edition in my low-power desktop build with a Core i7-6700T.
And I am tired of constantly arguing with people who are talking about the Mac platform, where TFLOPS performance is everything for professional apps, yet use gaming benchmarks to prove their point of view.
I think TFLOPS is very misleading; all our GPU tests use real-world applications and measure how they perform. You need that software layer to really test the power of a GPU. TFLOPS is almost a MYTHICAL way of talking about a GPU's performance and potential. Real numbers, in real applications, are the only way to tell how a GPU performs.
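For anyone curious where those paper numbers come from: peak FP32 TFLOPS is just shader cores x boost clock x 2 (one fused multiply-add per core per cycle). A quick sketch using the publicly listed specs of the two cards mentioned earlier shows why paper TFLOPS and real performance diverge:

```python
# Back-of-envelope theoretical FP32 TFLOPS: cores x boost clock (GHz) x 2
# FLOPs per cycle (one fused multiply-add). Specs are the public figures
# for these cards; real-application throughput is usually far lower.
def peak_tflops(cores, boost_ghz, flops_per_cycle=2):
    return cores * boost_ghz * flops_per_cycle / 1000.0

print(f"RX 480:   {peak_tflops(2304, 1.266):.2f} TFLOPS")  # ~5.83
print(f"GTX 1060: {peak_tflops(1280, 1.708):.2f} TFLOPS")  # ~4.37
```

On paper the RX 480 has roughly a third more FP32 throughput, yet the 1060 usually wins in real applications, which is exactly why the paper number alone tells you so little.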