He's totally right; you are making a strawman argument. I wasn't arguing about power draw, I was arguing about performance. The 450 has only like 20% more TFLOPs than the integrated Iris 550 graphics in the 13" rMBP. (lol)

Apple's telling pros the 450 is all you need. It is not. That's the point. If I wanted efficiency over power and function, I'd use an iPad.

As far as what Apple "could have used"... well sure, if you grant that 1) they had to use a 20-30% smaller battery and 2) they had to make a super thin and light computer, crippled like an iPad.
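For what it's worth, the ~20% figure roughly checks out on paper. A quick sketch from the commonly published specs (the EU/shader counts and clocks below are spec-sheet assumptions, not measurements):

```python
# Sanity check of the "~20% more TFLOPs" claim. Peak FP32 = units * FLOPs
# per cycle * clock; the unit counts and clocks are published spec-sheet
# figures, assumed here rather than measured.

# Radeon Pro 450: 640 stream processors, 2 FLOPs/cycle (FMA), ~800 MHz
rp450 = 640 * 2 * 800e6 / 1e12

# Iris Graphics 550: 48 EUs, 16 FLOPs/cycle each (2x SIMD-4 FMA), ~1.1 GHz boost
iris550 = 48 * 16 * 1.1e9 / 1e12

print(f"Radeon Pro 450: {rp450:.2f} TFLOPs")     # ~1.02
print(f"Iris 550:       {iris550:.2f} TFLOPs")   # ~0.84
print(f"advantage:      {(rp450 / iris550 - 1):.0%}")  # ~21%
```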
As I have said: you responded to a post which claimed that Apple is not getting the best possible hardware for their computers, with an answer about the MBP.

I have told you that, despite what you believe to be true, it is the best possible hardware for the MBP. If it is not, then show that that is the case. Show me a better GPU that fits in a 35W thermal envelope. From anyone.


Edit: update. I forgot: news from retail. Rumors are saying that Nvidia is preparing a refresh of the Pascal lineup. Higher clocks, higher core counts for specific performance brackets, lower prices (possibly...).
 
As I have said: you responded to a post which claimed that Apple is not getting the best possible hardware for their computers, with an answer about the MBP.

I have told you that, despite what you believe to be true, it is the best possible hardware for the MBP. If it is not, then show that that is the case. Show me a better GPU that fits in a 35W thermal envelope. From anyone.


Are you seriously freaking telling me what I meant by what I said? I very clearly was talking about performance. The meme was about performance. It has nothing to do with your pet argument rationalizing Apple's decision (and, I'm guessing, your purchase) on your rMBP.

If you would like to say "oh whoops, my bad, I didn't realize that by only mentioning performance, you were talking about performance. I guess my argument has nothing to do with what you're saying", that would be appropriate right about now.
 
Apple's telling pros the 450 is all you need. It is not. That's the point. If I wanted efficiency over power and function, I'd use an iPad.
The first time I used Macs in a professional environment, the designers had beige PowerMacs with Zip drives in the front, and they were proud of telling us writers that these were the most powerful desktop computers in the world. Which they were, at the time.

Every time a new generation of Macs came out they'd be upgraded, always with the biggest, baddest desktop money could buy, and the older ones would eventually trickle down to the writers.

Beyond the obvious benefits of having a faster computer, it's a matter of pride, to a certain extent. Making people feel important by having the very best equipment available. It also made me a dedicated Mac buyer for my home machines, but it's a real shame that the computers they make these days are so hopelessly out of date.

Been a long time since I bought a real Mac and I can't see myself ever buying one again, to be honest. I'm not paying all that money for something that's 'good enough'.
 
Are you seriously freaking telling me what I meant by what I said? I very clearly was talking about performance. The meme was about performance. It has nothing to do with your pet argument rationalizing Apple's decision (and, I'm guessing, your purchase) on your rMBP.

If you would like to say "oh whoops, my bad, I didn't realize that by only mentioning performance, you were talking about performance. I guess my argument has nothing to do with what you're saying", that would be appropriate right about now.
I suppose we have to agree to disagree then.
 
The first time I used Macs in a professional environment, the designers had beige PowerMacs with Zip drives in the front, and they were proud of telling us writers that these were the most powerful desktop computers in the world. Which they were, at the time.

Beyond the obvious benefits of having a faster computer, it's a matter of pride, to a certain extent. Making people feel important by having the very best equipment available. It also made me a dedicated Mac buyer for my home machines, but it's a real shame that the computers they make these days are so hopelessly out of date.


This is it exactly: whatever it was, Apple would push the fastest tech out there, the latest chips, etc. It was function first and then form (product design). They balanced the two, but function was the leader.

Function and form working together.

Now it is form first (design) over function (what the product actually does). I buy for function first: processing power, etc. Apple would then make something beautiful out of that raw power.

Not anymore. This Apple has lost the plot, and it is ultimately Timmy's fault.
 
Since you have no logical arguments to counter what I have written, you have attacked me instead.

Do you have a counterargument? Do you have an Nvidia GPU that fits in a 35W TDP and IS faster than the Radeon Pro 460? Do you have ANY GPU that is faster than the RP 460 in the same thermal envelope?

I thought people on this forum praised Nvidia for its efficiency. Now that it's AMD who provided the more efficient GPU, the goalposts have been moved elsewhere.

The fact that AMD's GPU fits in a (silly) 35W TDP budget does not make their GPU more efficient than NVIDIA's. It just means that they produced a GPU with a 35W TDP. No more, no less. Efficiency is performance per watt, and NVIDIA has been miles ahead of AMD in this area for the last few generations. A simple example would be the 120W GTX 1060 competing with and normally beating the 150W RX 480.

If Apple went with NVIDIA this round, you would see some kind of 35W NVIDIA GPU (probably some variant of GP107) and it would still be more efficient than an AMD GPU with the same power budget.
 
The fact that AMD's GPU fits in a (silly) 35W TDP budget does not make their GPU more efficient than NVIDIA's. It just means that they produced a GPU with a 35W TDP. No more, no less. Efficiency is performance per watt, and NVIDIA has been miles ahead of AMD in this area for the last few generations. A simple example would be the 120W GTX 1060 competing with and normally beating the 150W RX 480.

If Apple went with NVIDIA this round, you would see some kind of 35W NVIDIA GPU (probably some variant of GP107) and it would still be more efficient than an AMD GPU with the same power budget.
No it would not. The 75W GTX 1050 Ti has 2.15 TFLOPs of compute power. The Radeon Pro 460 at 35W has 1.86. Downclocking the GTX 1050 Ti into a 35W thermal envelope will give AT BEST the exact same performance.
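Both TFLOPs figures here can be reproduced from the published shader counts and boost clocks, and the performance-per-watt gap falls out of the same arithmetic. A back-of-envelope sketch (specs assumed from public listings, not verified):

```python
# Peak FP32 TFLOPs = shaders * 2 ops/cycle (FMA) * clock. Shader counts and
# boost clocks below are the commonly published specs, not measured values.

def tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak FP32 throughput in TFLOPs."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

gpus = {
    # name: (shaders, boost clock MHz, TDP watts)
    "GTX 1050 Ti":    (768,  1392, 75),
    "Radeon Pro 460": (1024, 907,  35),
}

for name, (shaders, clock, tdp) in gpus.items():
    peak = tflops(shaders, clock)
    print(f"{name:>15}: {peak:.2f} TFLOPs, {peak / tdp * 1000:.1f} GFLOPs/W")

# Naively scaling the 1050 Ti's throughput linearly down to 35W leaves ~1.0
# TFLOPs (2.14 * 35/75). In practice lower clocks also allow lower voltage,
# and dynamic power goes roughly as f*V^2, so a real 35W part would land
# somewhere between that floor and the Radeon's 1.86 TFLOPs.
print(f"1050 Ti scaled linearly to 35W: {tflops(768, 1392) * 35 / 75:.2f} TFLOPs")
```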
 
Historically Apple have played Nvidia and ATI/AMD off against each other with each iteration of iMac/Mac Pro/MacBook(not)Pro, switching between the vendors according to which one gave them the sweetest deal for high-volume, high-price, low-performance models of each type of computer.
My first iMac had an ATI Rage Pro Turbo in it, my second a Nvidia Geforce2 MX, my third a Radeon 9600.
My Mac Pro 3,1 started out with an ATI HD2600XT, then an Nvidia 8800GT, then an ATI HD4870, and finally an HD5870, but an Nvidia GTX 280 could have been an option.
I think the best bet is to be GFX card agnostic and just run whatever fits and Apple sees fit to support. The arguments will go on forever otherwise and only Apple have any say on what goes in their machines; it is after all their train set.
My Hackintosh runs a GTX 770 just because it is supported and didn't need any flashing to work, happy days.
 
No it would not. The 75W GTX 1050 Ti has 2.15 TFLOPs of compute power. The Radeon Pro 460 at 35W has 1.86. Downclocking the GTX 1050 Ti into a 35W thermal envelope will give AT BEST the exact same performance.

The RX 480 has something like 30% more TFLOPs than the GTX 1060, yet the latter still wins in nearly all real-world game benchmarks. We've had this discussion before -- raw TFLOPs is not always an accurate measurement of performance. I'm pretty tired of having this discussion with you, so I'll leave it at that. All I know is that I'm very happy with my 2016 Razer Blade with a GTX 1060, because Razer is smart enough to not have a 35W TDP for their laptops.
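The "~30% more TFLOPs" figure also checks out on paper, which makes the benchmark outcome the interesting part. A quick sketch (specs assumed from public listings, not measured):

```python
# Checking the "~30% more TFLOPs" figure from public spec listings:
# cores/stream processors * 2 ops/cycle (FMA) * boost clock.
gtx1060 = 1280 * 2 * 1.708e9 / 1e12   # ~4.4 TFLOPs
rx480   = 2304 * 2 * 1.266e9 / 1e12   # ~5.8 TFLOPs

print(f"GTX 1060: {gtx1060:.2f} TFLOPs")
print(f"RX 480:   {rx480:.2f} TFLOPs")
print(f"RX 480 paper advantage: {rx480 / gtx1060 - 1:.0%}")

# Despite a ~33% paper advantage, the 1060 wins most game benchmarks:
# games also stress geometry, fill rate, memory and drivers, not just
# raw shader arithmetic, which is the point being made above.
```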
 
The RX 480 has something like 30% more TFLOPs than the GTX 1060, yet the latter still wins in nearly all real-world game benchmarks. We've had this discussion before -- raw TFLOPs is not always an accurate measurement of performance. I'm pretty tired of having this discussion with you, so I'll leave it at that. All I know is that I'm very happy with my 2016 Razer Blade with a GTX 1060, because Razer is smart enough to not have a 35W TDP for their laptops.
I am glad that you are happy with it; I am happy with my Zotac GTX 1050 Ti OC Edition in my low-power desktop build with a Core i7-6700T.

And I am tired of constantly arguing with people who talk about the Mac platform, where TFLOPs performance is everything for professional apps, yet use gaming benchmarks to prove their point of view.
 
I am glad that you are happy with it; I am happy with my Zotac GTX 1050 Ti OC Edition in my low-power desktop build with a Core i7-6700T.

And I am tired of constantly arguing with people who talk about the Mac platform, where TFLOPs performance is everything for professional apps, yet use gaming benchmarks to prove their point of view.

Which professional app is actually limited by shader execution and thus TFLOPs? All the video ones are limited by moving data around (i.e., over the PCIe bus).
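To make that concrete, here's a toy Python model of transfer vs. compute time for a single graded frame. Every number in it (frame format, bus throughput, per-pixel FLOPs, sustained rate) is an assumption picked for illustration, not a measurement of any real app:

```python
# Rough model of the point above: for video apps the PCIe traffic can cost
# as much as the shading itself. All figures below are assumptions.

W, H = 3840, 2160                 # UHD frame
bytes_per_px = 4 * 2              # RGBA, 16-bit half float
frame_bytes = W * H * bytes_per_px

pcie_gbs = 12e9                   # assumed practical PCIe 3.0 x16 throughput
flops_per_px = 200                # hypothetical per-pixel cost of a grade/effect
sustained_tflops = 0.6            # assumed sustained rate on a ~1.9 TFLOPs part

transfer_ms = frame_bytes / pcie_gbs * 1e3
compute_ms = W * H * flops_per_px / (sustained_tflops * 1e12) * 1e3

print(f"frame size:      {frame_bytes / 1e6:.1f} MB")
print(f"upload+readback: {2 * transfer_ms:.1f} ms")   # ~11 ms
print(f"shading:         {compute_ms:.1f} ms")        # ~3 ms

# With these assumptions the bus traffic costs more than the math, which is
# why extra TFLOPs often don't show up in video benchmarks.
```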
 
The fact that AMD's GPU fits in a (silly) 35W TDP budget does not make their GPU more efficient than NVIDIA's. It just means that they produced a GPU with a 35W TDP. No more, no less. Efficiency is performance per watt, and NVIDIA has been miles ahead of AMD in this area for the last few generations. A simple example would be the 120W GTX 1060 competing with and normally beating the 150W RX 480.

If Apple went with NVIDIA this round, you would see some kind of 35W NVIDIA GPU (probably some variant of GP107) and it would still be more efficient than an AMD GPU with the same power budget.

I agree, the argument for AMD is a confusing one. I, for one, coming from film and video, have seen what a difference a GPU makes when taken to extremes; we push whatever GPUs we have to their absolute limits. There was a time when AMD and Nvidia were kinda neck and neck, but the small edge Nvidia had over AMD has eventually grown too large as the manufacturing process has gotten smaller and new and better designs have been implemented. It's sadly not a competition anymore, and AMD in all of our tests doesn't make sense, even at a cheaper price.

First, we tested the MacBook Pro with the 460 @ 35W, and at times it was only a bit faster than the 960M @ 50W, and it was never faster than the 965M @ 20-50W; it had bursts that got almost to the 965M, but I assume that is the 4GB of RAM on the GPU. Before the chip came out, AMD believers were saying it would SMOKE all laptop GPUs, and it absolutely has not. It's pretty MEH!

Secondly, that is just the HARDWARE layer, where it was MEH! Now we jump to the SOFTWARE layer, and we are totally screwed. OpenCL was and is a great IDEA, but it has never been fully used or optimized by the software that is supposed to take advantage of it. Everyone wants their own framework: Metal for Apple, and Vulkan/MoltenVK for AMD. Well, we get screwed by both for different reasons.

Metal, and dare I say Vulkan, are gonna be screwed, because Apple doesn't care about the high end. The software layer will harness a GPU's power, but if your GPU is crappy in the first place, your graphics layer is only as good as your GPU; if you have a MEH! GPU, you have a MEH! graphics layer. And this is just Apple and Apple products. Again, Apple have designed themselves into a corner. Apple has stripped all their software down to bare bones and gotten rid of everything that was competing on the high end. Final Cut Pro is now kinda iMovie-ish. Aperture is gone and is now Photos, which has no pro photo features. I could see Apple and a few other companies writing clean, nicely accelerated programs under Metal, but the underlying GPU won't be that great, so it's kinda lame. There are no developers writing the software of today who are staying committed to Apple; Apple has just made it so hard, by designing hardware in a bubble and then having all the software vendors scramble after the hardware is released. Also, how long does macOS have? Anyone that uses high-end programs will have to go to PC at some point; as Apple strips down their hardware even further, people will have to leave macOS to take advantage of current hardware. ALSO, Vulkan seems more geared towards games; there might be some OS X/PC games released simultaneously, but they probably won't be the main mainstream titles.

If we build workflows using NVIDIA GPUs, they are amazingly fast. Our GTX 980 Tis absolutely scream for CUDA, Blackmagic DaVinci Resolve, and Adobe CC.

ANYWAY, this all just leaves people in my industry confused as to where Apple is going and why they are making the decisions they are. And the only answer is CHEAP CHEAP hardware with high markups: make as much $$ as you can as you phase out the high end, towers, laptops, and whatever else. Force all macOS users into something like an iMac and something like an ultrabook. Try to convince everyone that that is all you need to do anything in this world. Then make sure they buy a new iPhone every few years, and that's it... oh, and sell them gimmicky gadgets.
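As a side note on the "one framework for both vendors" point above, a minimal OpenCL kernel driven from Python looks like this. It's a sketch, assuming pyopencl and a working OpenCL runtime are installed; nothing here comes from a real Resolve or FCPX pipeline.

```python
# Minimal vendor-neutral GPU example: a saxpy kernel that runs unchanged on
# AMD, Nvidia, or Intel OpenCL devices. Purely illustrative.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()          # picks any available OpenCL device
queue = cl.CommandQueue(ctx)

x = np.random.rand(1_000_000).astype(np.float32)
y = np.random.rand(1_000_000).astype(np.float32)
expected = 2.0 * x + y                  # CPU reference for the check below

mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

program = cl.Program(ctx, """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y) {
    int gid = get_global_id(0);
    y[gid] = a * x[gid] + y[gid];
}
""").build()

program.saxpy(queue, x.shape, None, np.float32(2.0), x_buf, y_buf)
cl.enqueue_copy(queue, y, y_buf)        # result comes back over the bus
print("matches CPU reference:", np.allclose(y, expected))
```

The same source runs unchanged across vendors, which was OpenCL's whole pitch; note the explicit copy back over the bus at the end, the same traffic the data-movement argument earlier in the thread is about.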
I am glad that you are happy with it; I am happy with my Zotac GTX 1050 Ti OC Edition in my low-power desktop build with a Core i7-6700T.

And I am tired of constantly arguing with people who talk about the Mac platform, where TFLOPs performance is everything for professional apps, yet use gaming benchmarks to prove their point of view.
I think TFLOPs is very misleading; all our GPU tests are with real-world applications and how they perform. You need that software layer to really test the power of a GPU. TFLOPs is almost a MYTHICAL way of talking about a GPU's performance and potential. Real numbers, in real applications, are the only way to tell how a GPU performs.
 
I think TFLOPs is very misleading; all our GPU tests are with real-world applications and how they perform. You need that software layer to really test the power of a GPU. TFLOPs is almost a MYTHICAL way of talking about a GPU's performance and potential. Real numbers, in real applications, are the only way to tell how a GPU performs.
If you exclude CUDA and use the same environment for both AMD and Nvidia, a 5.8 TFLOPs GPU from AMD (RX 480) is 10% slower than a 6.5 TFLOPs GPU from Nvidia (GTX 1070).

You talk about real-world numbers. Have you provided any? We have numbers for the Radeon Pro 450, and that GPU is on the level of a GTX 950M. The Radeon Pro 460 can only be faster than that.
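Taking the post's own figures at face value, a two-line check shows what they imply about per-TFLOP efficiency (a sketch over the numbers quoted above, nothing more):

```python
# If the GTX 1070 scores 1.00 and the RX 480 scores 0.90 in the same
# (non-CUDA) environment, how much score does each extract per theoretical
# TFLOP? Peak TFLOPs and the 10% gap are the figures from the post.
cards = {
    "GTX 1070": (6.5, 1.00),   # (peak TFLOPs, relative score)
    "RX 480":   (5.8, 0.90),
}

for name, (peak, score) in cards.items():
    print(f"{name}: {score / peak:.3f} relative score per TFLOP")

# Both land around 0.15, i.e. on this reading compute-bound performance
# tracks the TFLOPs rating almost exactly, which is the post's claim.
```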
 
I keep hoping to see Mac Pro 1080 news in this Mac Pro 1080 thread, but it's all about AMD mobile chips, MacBooks, and 35W TDP envelopes.
Which baffles me, because nobody appears to be interested in this:
Edit: update. I forgot: news from retail. Rumors are saying that Nvidia is preparing a refresh of the Pascal lineup. Higher clocks, higher core counts for specific performance brackets, lower prices (possibly...).
 
If you exclude CUDA and use the same environment for both AMD and Nvidia, a 5.8 TFLOPs GPU from AMD (RX 480) is 10% slower than a 6.5 TFLOPs GPU from Nvidia (GTX 1070).

You talk about real-world numbers. Have you provided any? We have numbers for the Radeon Pro 450, and that GPU is on the level of a GTX 950M. The Radeon Pro 460 can only be faster than that.

We test with Blackmagic DaVinci Resolve, CUDA and OpenCL, on Mac. We have a laptop here with a GTX 960M running Windows and Resolve with CUDA that we tested against. We have a CUDA expansion chassis with 3x 980 Ti, a CUBIX. We have an Autodesk Flame with 4x K6000, and twenty-some AMD/ATI GPUs scattered about.

We only tested the Radeon Pro 460, and it was only like 10% to 15% faster than the AMD Radeon R9 M370X!! That's a 2GB GPU!! That was in Final Cut Pro X; in Blackmagic Resolve it was about 30% faster, but that was because of the extra 4GB of RAM, which Resolve takes advantage of dramatically. Final Cut Pro X is a better way of determining speed. I bet if the R9 M370X had 4GB of RAM, the 460 would only be 10% to 15% faster there as well.

That was a better apples-to-apples comparison; on the PC side it was Blackmagic Resolve using CUDA with the same footage, and it was a little faster than the R9.

We didn't test the 450 or 455, but they can only be way way way way way slower than the AMD Radeon R9 M370X.
I keep hoping to see Mac Pro 1080 news in this Mac Pro 1080 thread, but it's all about AMD mobile chips, MacBooks, and 35W TDP envelopes.
We have to talk about how bad AMD is at their job, and how much money Apple is making off their budget GPUs, before we can talk about how good Nvidia is at their job.

Is there a place to write Nvidia about Pascal drivers for OS X? Maybe we start an email bombardment.
 
Is there a place to write Nvidia about Pascal drivers for OS X? Maybe we start an email bombardment.

I would join in on that email bombardment if it happens. But I doubt it would go anywhere given the CEO's statement. It seems as if Nvidia isn't too interested because it's a lot of hard work for them without Apple's cooperation. :(
 
I would join in on that email bombardment if it happens. But I doubt it would go anywhere given the CEO's statement. It seems as if Nvidia isn't too interested because it's a lot of hard work for them without Apple's cooperation. :(

Yeah, I mean part of the back-and-forth argument over AMD is just that... Zealots argue that AMD is still high end, while everyone in a high-end industry is saying it absolutely is not. You realize Apple isn't really interested in the high-end market anymore. And because they can get AMD GPUs below cost, they can make more $$$ off of them; as long as they convince the zealots how great those GPUs are, they can pretend they are still relevant in film, digital media, graphics, CG, and gaming, which they aren't anymore.
 
I bet if the R9 M370X had 4GB of RAM, the 460 would only be 10% to 15% faster there as well.....

We didn't test the 450 or 455, but they can only be way way way way way slower than the AMD Radeon R9 M370X.

The 455 has been tested to death against the M370X in every possible way. Both have 2GB. The 455 eats the 370 alive.

Please refrain from the internet if you can't use it properly.
 
The 455 has been tested to death against the M370X in every possible way. Both have 2GB. The 455 eats the 370 alive.

Please refrain from the internet if you can't use it properly.

This is why these forums are so exhausting: too many fanboys and novice users who don't use high-end equipment. I did real-world tests and gave real-world examples, in this case tests from Blackmagic DaVinci Resolve and Final Cut Pro X, while you say things like "the 455 eats the 370 alive" without any examples; that makes you the fanboy. I'll ignore the comment about "Please refrain from the internet if you can't use it properly," since you might have some issues not relevant to this discussion.

FROM ANOTHER THREAD
Using a MacBook Pro 13,3 2.9GHz - 16GB RAM - AMD Radeon Pro 460 - 2TB Flash Storage
https://forums.macrumors.com/thread...bp-chime-in-here.2019393/page-2#post-24035761
I've spent the last week finishing a motion design job mainly in After Effects, AME, Cinema 4D, with a few other CC apps thrown in. After Effects renders marginally faster than the mid-2012 MBP I've been using but Cinema 4D is a huge step backwards. Way slower renders. The only area where I see significant improvement over my 4.5 year old MBP is in the newer 3D extrusion functionality in AE 2017.

I'm struggling mightily with the new keyboard and basically turned off most of the Touch Bar's functionality because I keep bumping it. Battery life is 2 hours at best. Getting my fair share of kernel panics from weird things I haven't narrowed down yet.

I WANT to believe but...man! I've owned about every pro Apple laptop since 1998 and this is the first one I've ever had that didn't feel like a step forward.

If the 460 is so fast, why do real users like this complain about how slow it is?
 
I would join in on that email bombardment if it happens. But I doubt it would go anywhere given the CEO's statement. It seems as if Nvidia isn't too interested because it's a lot of hard work for them without Apple's cooperation. :(

The updates to the web drivers have been based on Nvidia pitching new GPUs to Apple. They show up for end consumers as an added bonus.

If Apple told Nvidia that they were not going to be a part of Mac hardware in the future, Nvidia has no reason to update the web drivers with support for new GPUs.
 