If you haven't noticed, this is the Mac Pro forum. I'm not sure how the GPUs you are telling us about are relevant...

What we are interested in here is what AMD has to offer against the 1080, or even the 1070.
If you get excited about the performance of the least performant card in the Pascal family, that's OK, but not in the Mac Pro forum.

I'm not a fan of any brand or corporation, I'm just a fan of the best tool for the job, and at this moment that comes from Nvidia.
AMD keeps promising the next big thing that will dwarf Intel and Nvidia, and we are still waiting...
I was responding to Mayo's post and wanted to put a slight point o
Where is AMD promising anything? Have they said anything about Vega yet? Have they said anything about Zen claiming it can dwarf Intel? All they did was compare a Zen core to a Broadwell-E core rendering in Blender, both at 3 GHz.

This is the biggest problem with this particular brand. People take what is on rumor sites for granted and, what is worse, think it is what AMD PROMISED!

A similar thing happened with the Polaris architecture. It was WCCFTech who wanted clicks and reposted every **** they found on the internet, hyping Polaris beyond anything. I was on the hype train myself.
The RX 480 is a sweet spot for the mid-range of that market, but Nvidia has already answered it with the GTX 1050 and GTX 1060.

Care to share a link?
There are a lot of articles on this...

About the compute performance of the GPUs.

And this particular thing should be important to you as professional users: compute performance. Yes, the GTX 1070 is faster. But the price difference between the GPUs is $170. $170 for 10% more performance. All Nvidia has ever had is the mindshare driven by gaming benchmarks.
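
To put that price gap in perspective, here is a quick perf-per-dollar sanity check (a rough sketch only; the base price below is a placeholder consistent with the $170 gap quoted above, not an actual MSRP):

```swift
// Quick perf-per-dollar sanity check for the claim above.
// The RX 480 price is a placeholder consistent with the "$170 gap", not a quoted MSRP.
let rx480Price = 230.0
let gtx1070Price = rx480Price + 170.0
let perfGain = 0.10                                        // "10% more performance"

let extraCost = (gtx1070Price / rx480Price - 1.0) * 100.0  // ≈ 74% more money
print("≈\(Int(extraCost.rounded()))% more money for ≈\(Int((perfGain * 100).rounded()))% more compute")
```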

I suppose this thread is again devolving into AMD vs. Nvidia...

Let's end this before it turns into a gigantic off-topic tangent, for which I already apologized in my first post, where I only wanted to make a point about efficiency. And believe me, I am not saying that AMD is better here. But there are situations in which efficiency is exactly the same for both brands.
 
But the price difference between the GPUs is $170.

Like that matters in a "premium" device like the Mac Pro or MacBook Pro with Touch Bar™?

The only reason Apple still uses AMD is because AMD does what Apple wants them to do and the big mean green machine (nVidia) doesn't cater to Apple.

The D300 and D700 are terrible GPUs and they WERE terrible when they were first used.

CUDA development is still huge and it's not going away any time soon. There are literally render farms using CUDA in the 3D world and scientific computing world.

In every app that I have used, CUDA has at least 5x the performance of OpenCL.

This includes DaVinci Resolve, Premiere Pro CC, After Effects, and even CUDA-only render engines like Octane.

Apple was instrumental in getting OpenCL developed as a standard, but it is now completely dumping it for the macOS-only Metal.
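
For anyone wondering what "dumping OpenCL for Metal" means on the code side, here is a minimal sketch of the CPU half of a Metal compute dispatch in Swift. The kernel name "scale_kernel" and the buffer size are made up for illustration; this is not any particular app's code.

```swift
import Metal

// Minimal sketch of the CPU side of a Metal compute dispatch.
// Assumes a kernel named "scale_kernel" in the app's default metallib;
// the kernel name and buffer size are made up for illustration.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let kernel = library.makeFunction(name: "scale_kernel") else {
    fatalError("Metal device or kernel not available")
}
let pipeline = try! device.makeComputePipelineState(function: kernel)

var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// 1024 elements split into 4 threadgroups of 256 threads each
// (a real app would respect pipeline.maxTotalThreadsPerThreadgroup).
encoder.dispatchThreadgroups(MTLSize(width: 4, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 256, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```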

On top of all this, Pascal is completely obliterating AMD every step of the way, while staying on a 16nm process and keeping wattage low.

I have a GTX 1080 and it barely spins up the fans under full load in really intensive 3D applications and games, and it runs everything at max speed.

Also I've seen that video you just posted. I don't think that guy is using CUDA...he's probably using OpenCL renderers.

FYI, things like Octane are CUDA-only, not OpenCL.
 
Wait, everything you wrote completely contradicts the factual data that guy provided in the video, comparing a mainstream GPU with a high-end one. Of course the GTX 1080 will be faster. That is not the question here. The point is that it will not be as much faster as you try to spin it here. The RX 480 has 5.8 TFLOPs, the GTX 1070 has 6.5 TFLOPs, and the GTX 1080 has 9.2 TFLOPs. If the performance difference between the RX 480 and GTX 1070 is 10% in professional applications, which directly reflect the compute performance of the GPUs, then the GTX 1080 should come out roughly 50-60% faster than the RX 480, in line with its TFLOP advantage.
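
Back-of-the-envelope, that scaling argument looks like this (a rough sketch; the TFLOP figures are the ones quoted above, and real-world results will vary by workload and driver):

```swift
// Rough sketch of the scaling argument above. TFLOP figures are the ones
// quoted in the post; real results vary by workload and driver.
let rx480TFlops = 5.8
let gtx1070TFlops = 6.5
let gtx1080TFlops = 9.2

let gap1070 = (gtx1070TFlops / rx480TFlops - 1.0) * 100.0   // ≈ 12% more raw compute
let gap1080 = (gtx1080TFlops / rx480TFlops - 1.0) * 100.0   // ≈ 59% more raw compute

// If ~12% more TFLOPs only buys ~10% in real compute workloads, the 1080's
// ~59% TFLOP advantage should translate to roughly 50%, slightly compressed.
let observed1070Gain = 10.0
let projected1080Gain = gap1080 * (observed1070Gain / gap1070)
print(gap1070, gap1080, projected1080Gain)   // ≈ 12.1, 58.6, 48.6
```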

Apple does not stick with AMD without reason. They simply booted Nvidia out of Apple hardware because of the lawsuit Nvidia wanted to force on Apple, and decided to develop their own hardware-software ecosystem. That is possibly what spawned the AMD-Apple relationship. According to Lisa Su, Apple and AMD co-engineered the Polaris GPU, and AMD helped Apple create Metal using Mantle.

I know the pain of people who rely on CUDA software and can't use it on Macs with decent hardware. But why develop software for a dead hardware platform? Nvidia on Macs is dead for the foreseeable future. Do you think a developer who wants to be in the Apple ecosystem will use CUDA, or Metal? That is the very reason Nvidia is looking for engineers who can work with Metal: they will have to adapt to Apple's platform if they want to get a spot here.

This is the best scenario for Apple, because with this approach they can get the best possible hardware in the smallest thermal envelope, regardless of brand. Two GPUs from two brands offer 7 TFLOPs of compute power at 125W. Which will they pick? The one that gets them the best pricing. They control the software, the hardware, everything here. And if the information SemiAccurate was able to dig up is true, Apple can also control manufacturing of the GPUs. The syncing of the process between GloFo and Samsung was for Apple. IBM also sold its fab, process technology, and engineers to GloFo because of Apple's demand for the technology. FD-SOI, unfortunately, has not seen the real world yet.

This is how the picture looks after collecting all of the bits we have seen over the past 3 years, whether we like it or not.

P.S. I'm glad you are happy with your GPU ;).
 
The 2013 was a total redesign and deserved an event. If the update comes at the end of November or early next year, it will just be an update and not necessarily deserving of an event. I would imagine an updated processor, new AMD chips, TB3, 10GbE, and faster/bigger SSDs.
With one major difference: in 2013 they redesigned the older model, and it was only 1 year since the last update, so it was still current. Now they have let the Mac Pro stagnate for 3 years, so I would think they would want an event to loudly proclaim that they didn't give up on the Mac Pro but worked to deliver the best Mac Pro ever. All about PR and comforting the troops.
 
Yeah, and market share for AMD in PC gaming jumped in the last two quarters because of console sales?
Let's be honest here, it's not hard to get a good boost in market share when you're already way at the bottom to start with...
 
Wait, everything you wrote completely contradicts the factual data that guy provided in the video...


Sorry, but I'm not reading that wall of text. You can make excuses for Apple as much as you want; it's not going to change the fact that they keep dropping the ball on every release since 2012.

We're in 2016, almost 2017, in terms of what graphics can do; Apple doesn't want to join the club and instead wants to make its own club, which no one wants to join.

It MIGHT work on a mobile device like an iPhone or an iPad, but it won't on a personal computer.
 
Actually I knew you would do that.

So let's end this conversation here before we completely derail this thread.
 
The only reason AMD is still in business is that the Xbox and PS4 use their GPUs. And now the MacBook Pros.

And they have an extremely rare x86 license (so they can get loans to cover losses while they get their act together).

No one in the scientific computing industry uses AMD, it's all nVidia....

Technically no.

https://www.top500.org/statistics/list/

Pull the stats on the Category -> Coprocessors.

AMD-ATI has 3 slots. It is a small percentage (of only the top 500), but it is non-zero. In tighter-budget systems (not the "supercomputer" 6+ digit price range) it is likely higher.

The major competitor to Nvidia is Intel. The myopia in this forum always seems to turn histrionic about there being only two options, but there aren't just two. Also not on the top500 list are the Altera/FPGA/etc. devices that some of the super heavy crunchers don't want to let folks fully know about.

Apple isn't chasing the scientific market though. That was obvious when the Mac Pro 2013 showed up without ECC support on GPU computations.
 
Sorry friend, I don't buy this theory unless Apple is controlling AMD.

Does Apple have a controlling interest in ARM? No.
Does Apple have a controlling interest in Imagination Tech (PowerVR)? No.
Sharp or any of the display panel vendors? No.

A beyond dubious criterion, especially when it is not necessary.


So unless there are some cues about Apple having a controlling interest in a company they depend on, I doubt Apple is granting indirect control over some of its products to an external interest.

ARM, PowerVR, etc.?

Look, AMD has an embedded graphics business. They build stuff to the constraints of MS and Sony. These MBP GPUs aren't even that specialized; other systems are going to get them. This is probably far closer to the relationship that Apple had with Intel on Thunderbolt at the initial launch and through version 2 (and the design spec development of TB v3).
Apple probably dedicated a team of folks to bringing this subsystem to market. As such they had, for an extremely rare occasion in the Mac GPU space, early access and high demand for the part during the initial runs. Apple probably bought up almost all (if not all) of them for the initial production runs. Apple doesn't have to buy a controlling interest; they could simply make an extremely large upfront payment for goods. (They have done that for screens where they wanted chokehold control over the supply, and for other large infrastructure equipment buys... the dubious sapphire plant adventure.)


I wouldn't even be surprised if Apple switched its Mac Pro GPU to nVidia at the last moment just to keep doors open with an alternative provider.

Nvidia really doesn't have much of an embedded GPU business, at least in the performance/implementation range of the classic laptop/desktop space (there are some corner cases), and that goes for other spaces too.

Besides the IBM POWER space, Nvidia is mostly playing the embedded market in a way where they are trying to inject their CPU along with their GPU. Nvidia doesn't have any CPU that Apple needs. Not even in the slightest.

AMD could, if they get their x86 house in order. macOS is on x86, and the majority of the Mac lineup needs an x86+GPU package solution. Nvidia can't provide one, so they are a non-starter in the Mac space for that increasingly larger subset.
 
Thanks.

I'm the biggest Apple apologist there is, especially in the real world against my peers, but I cannot defend what they're doing now.
I believe many in this thread are in the same boat. What worries me is the direction Apple is going when they take longtime prosumer and professional customers and leave that product category stagnant. In the G4 and G5 days, IIRC, there was at least the occasional spec bump, in graphics or memory options, to help keep things somewhat current between big updates.

I was actually looking at MS's new Studio offering, and I avoid MS like the plague, using it only for Revit.
 

I believe this is why Apple will remain stagnant.

That is "NeXT" Steve not really the one who grew Apple the second time around. NeXT sputtered so he had not mastered products at that point either ( better than before but still off from where eventually got to) It isn't just "sales and marketing". One the diseases Apple has now is actually product design for product design sake. For example the $10K solid gold watch ... WTF is that as a technologic solution. Ive's crew probably milled and shaped gobs of money in the shape of gold working on the project.

The point isn't so much sales and marketing as being out of balance. Marketing has a role, engineering has a role, design has a role. Skewing too far into any one of those is likely bad over the long term.

At this point the design team is "too snooty" to do speed-bump upgrades. It has to be the best ever, insanely great, revolutionary, incredible..... and have a 5-minute, 50-shades-of-gray product introduction video about how this new product is better than sex.

Here is an illuminating quote.

'... “We didn’t want to just create a speed bump on the MacBook Pro,” he says. “In our view this is a big, big step forward. It is a new system architecture, and it allows us to then create many things to come, things that we can’t envision yet.” ... '
https://www.cnet.com/special-reports/does-the-mac-still-matter/

Since there were no speed updates on any other Mac (aside from a long overdue minor tweak to the MBA's memory earlier in the year, when Apple eased off being too Scrooge McDuck on RAM capacity), it has all the appearance that nothing else deserved a speed bump either.

Yeah, in a way they were due for a major bump on the MBP. But the MBA (Apple's economy, entry-level-priced system) sure could have used a speed-bump upgrade. It is a generation back for no good reason. It is the entry model, so it doesn't need bells and whistles and a "better than sex" video; it primarily needs to provide value. In the more price-sensitive zone the MBA is in, that is a big problem.

Apple chasing increasingly higher average selling prices will have negative effects on the Mac ecosystem over the long term. (That is another problem, and yes, it does fit the narrative in the video better.)
 
That is "NeXT" Steve not really the one who grew Apple the second time around. NeXT sputtered so he had not mastered products at that point either ( better than before but still off from where eventually got to) It isn't just "sales and marketing". One the diseases Apple has now is actually product design for product design sake. For example the $10K solid gold watch ... WTF is that as a technologic solution. Ive's crew probably milled and shaped gobs of money in the shape of gold working on the project.

The point isn't so much sales and marketing as much as out of balance. Marketing has a role, engineering has a role, design a role. Extremely skewed too far into any one of those is likely bad over the long term.

At this point the design team is "too snooty" to do speed bump upgrades. It has be the best, every, insanely great , revolutionary, incredible..... and have a 5 minus 50 shades of gray product induction video about of this new product is better than sex.

Here is an illuminating quote.

'... “We didn’t want to just create a speed bump on the MacBook Pro,” he says. “In our view this is a big, big step forward. It is a new system architecture, and it allows us to then create many things to come, things that we can’t envision yet.” ... '
https://www.cnet.com/special-reports/does-the-mac-still-matter/

Since there would no speed updates on any other Mac ( an long overdue minor tweak on the MBA's memory earlier in the year. Apple eased off of being too Scrooge McDuck on RAM capacity. ). It has all the appears that nothing else deserved a speed bump either.

Yeah, in a way they were due for a major bump on the MBP. But the MBA ( Apple's economy, entry level priced ) system sure could have used a speed bump upgrade. It is a generation back for no good reason. It is the entry model so it doesn't need bells and whilstles and a "better than sex" video. It primarily needs to provide value. In the more price sensitive zone the MBA is in that is big problem.

Apple chasing increasingly higher average selling prices will have negative effects on the Mac ecosystem over the long term. ( that is another problem is yes that does fit the narrative in the video better. )
Interesting perspective. How do you explain the Mac Pro 2013? For design's sake? If so, they basically wasted time and money for nothing. Also, what is your thought on Apple's secrecy? Other than keeping Apple afloat, why aren't board members doing anything to change Tim's vision... other than money?
 
I think we will start to see more "panic" from investors in the longer term. The iPhone 7 wasn't met with the annual fanfare it has had in past years; they got a bump in sales due to the competitor's device being an explosive device. This last 'hello again' event was a yawn (did you see the audience? They were bored, not engaged, no 'wow'. Seriously, go back and watch it; it was like a blind date, and the media were waiting for the emergency abort call from their roommate). Then the AirPods debacle (Apple has often said, in regard to releases, that "under promise, over deliver" really helps with customer relations/satisfaction). We are seeing expectations lowered, and if they continue this pattern it will be disastrous.

Additionally, Apple hasn't ever made the panels for their displays; they source the components and put them in their own housing with more advanced boards, etc. Why couldn't they do that with the LG 4K and 5K panels to match the Apple stuff on your desk???

It'll be fun to see this thread in a year.
 
Interesting perspective. How do you explain the Mac Pro 2013? For design's sake?
Tim is not a visionary. There is no committee in Apple. Steve described a very long time ago how Apple works, and it is always a collaboration of multiple people on multiple projects. The departments may not even know what each other is doing.

The Mac Pro design is the epitome of what Steve loved about the consumer market:
 
At this point the design team is "too snooty" to do speed-bump upgrades.

And I think nothing displays this problem at Apple better than the 16GB RAM limitation and the explanation from Schiller. Here we have a $2400+ laptop and we're stuck with the maximum RAM from what, 5+ years ago? It's beyond baffling to design a computer that prioritized reducing the height of the machine from 18mm to 15.5mm over a critical computing component (assuming here that increasing the thickness slightly would have meant a larger battery and acceptable battery life with 32GB of RAM). In whose world do 2.5mm and maybe up to half a pound matter more than having sufficient RAM for this premium computer to actually function properly over its lifetime?

Clearly the design and marketing crew called the shots here and the engineering teams just did what they could. At some point I'm sure there was testing of 2x16GB DDR4, and it showed battery life missing a marketing team target, so it was axed. That, or Schiller is just blowing smoke and using DDR4 at all would have made the cheap DDR3 in the other models too noticeable, and this is entirely a profit-driven choice.

At any rate, is it completely impossible for Apple to release a new product that isn't lighter and thinner! than the previous design? At some point we have to stop, right? Is the goal to make MacBooks as thin as iPads by 2020? Why?
 
And that's the biggest issue that keeps popping up. Notice, each time they play a presentation video, how many of them are narrated by Jony Ive. He seems to be calling the shots with his design-miniaturization expertise. If this is the best he can do, his participation at Apple needs to be miniaturized out of existence.
I believe this is why Apple will remain stagnant.
It's going beyond stagnation and into constipation.
 