Those of you who strongly prefer Nvidia GPUs, is it because of CUDA? If so, what applications support only CUDA and not OpenCL on OS X? From a developer's perspective, it seems like only cross-platform tools would fall into this category, as no OS X developer would require something that doesn't ship in any current Mac.

YES. It's all about the CUDA acceleration.

I was strictly a Final Cut Pro + After Effects + Pro Tools media producer. When Apple replaced FCPro 7.0.3 with FCPro X, and support for FCPro 7.0.3 stopped with 10.6.8, I had a choice. While I resisted switching to a PC platform and stayed with OS X and the cMP, I switched to the full Adobe CC Suite and Premiere Pro. As GPUs were tasked more and more with accelerating functions, nVidia CUDA has been the best choice for Adobe software. I'm glad both Adobe and nVidia continue to support my needs in ways Apple has neglected.
 
These developers have abandoned Apple OSX because of Apple's "kill CUDA" mentality.

Apple doesn't have a "kill CUDA" mentality; Apple has a "pay no more than this much to make a computer" mentality. AMD are in the range now because they're willing to do custom form factors for less (in the nMP's case, two grand per machine less) than nVidia. That is the beginning, middle, and end of the story.

Mac hardware development isn't hand in glove with software the way iOS hardware is, because so much of it is out of house: marketing gives hardware a budget, hardware makes the best it can with that budget, then software does the best it can with what hardware gives it. If nVidia had a cheaper solution than AMD for custom form factors, they'd be in the machines, and the zeitgeist would be all about how Apple custom-tailors software, especially FCPX, for nVidia hardware.
 
Apple doesn't have a "kill CUDA" mentality...

Perhaps, but Apple certainly doesn't mind releasing something as small as a Security Update that wreaks havoc with nVidia drivers. So, as an nVidia and Apple user, I delay the benefit of the Apple security update until it's supported by an nVidia driver update. How does that benefit the Apple universe? It doesn't, yet it still happens.
 
How does that benefit the Apple universe? It doesn't, yet it still happens.

Sometimes a security update is just a security update. Don't confuse "we're not going to test against this" with "we want this gone"; indifference is far more likely than a grand conspiracy. The nVidia situation is complicated by the fact that until recently (for whatever reason, necessity or convenience), nVidia were building their drivers in a way that directly contradicted Apple's guidelines, writing into /System. It seems that part of their switch to doing drivers "correctly" (perhaps to keep within a QA overhead) is signing the driver to work only up to a specific system build, so they never have customers with cards malfunctioning because they're on newer OS builds than the QA department has tested.
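
For anyone curious what that build lock keys on: it's the OS build string, the same value sw_vers -buildVersion prints. Here's a minimal C sketch of reading it via sysctl; this only shows where the value lives, not how nVidia's installer actually validates it, which is internal to their driver.

    /* Read the OS X build string (e.g. "15D21") that a build-locked
     * driver could compare against its list of tested builds.
     * Illustration only: the actual check is inside nVidia's kext. */
    #include <stdio.h>
    #include <sys/sysctl.h>

    int main(void) {
        char build[32];
        size_t len = sizeof(build);
        if (sysctlbyname("kern.osversion", build, &len, NULL, 0) != 0) {
            perror("sysctlbyname");
            return 1;
        }
        printf("OS build: %s\n", build); /* same as sw_vers -buildVersion */
        return 0;
    }

Even a Security Update bumps that build string, which is why a driver signed against one specific build falls over after an otherwise trivial update.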

I'd hazard a guess that the QA department for the Mac is maybe one person, part-time.

Think about it this way: if a printer or scanner driver fails with a system update, you can't print or scan. When your graphics driver fails, perhaps you can't even see the screen to figure out what's happening. It's better that they're cautious, IMHO.
 
These developers have abandoned Apple OSX because of Apple's "kill CUDA" mentality.

Furthermore, Apple has alienated these developers by staying on OpenCL 1.2. The rest of the world already has OpenCL 2.0 available. So if you really want up-to-date cross-platform technology, CUDA is the only proven option on OS X right now. (And with Metal, the cross-platform gap widens even further.)
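
To make the gap concrete: the headline OpenCL 2.0 feature is shared virtual memory, and its host-side entry point simply doesn't exist in the 1.2 headers Apple ships. A minimal sketch of what 2.0 code looks like, assuming a platform that actually exposes 2.0 (so not OS X):

    /* OpenCL 2.0 shared virtual memory (SVM). clSVMAlloc has no
     * OpenCL 1.2 equivalent, so this cannot be written on OS X today. */
    #include <CL/cl.h>
    #include <stdio.h>

    int main(void) {
        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);

        /* One allocation visible to both host and device; no explicit
         * clEnqueueWriteBuffer/ReadBuffer copies needed. */
        float *shared = (float *)clSVMAlloc(ctx, CL_MEM_READ_WRITE,
                                            1024 * sizeof(float), 0);
        if (!shared) { fprintf(stderr, "SVM not supported\n"); return 1; }
        shared[0] = 42.0f; /* host writes directly */
        /* ...pass to a kernel with clSetKernelArgSVMPointer()... */
        clSVMFree(ctx, shared);
        clReleaseContext(ctx);
        return 0;
    }

On OS X you're stuck with the 1.2 buffer-and-copy model for all of this.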
 
Perhaps, but Apple certainly doesn't mind releasing something as small as a Security Update that wreaks havoc with nVidia drivers. So, as an nVidia and Apple user, I delay the benefit of the Apple security update until it's supported by an nVidia driver update. How does that benefit the Apple universe? It doesn't, yet it still happens.

Nvidia's driver breaks itself automatically if the build number changes. That was totally Nvidia's fault, not Apple's. It's a "feature" Nvidia added.

Apple shouldn't be expected to tiptoe around other developers' "features" like that. It would be crazy to think Apple should not be able to update their own build numbers or version numbers without Nvidia's permission.

As far as killing CUDA goes, Apple never supported it to begin with. They're not going to offer an Nvidia option just for CUDA when it's not an officially supported technology and never has been. It's like saying you won't buy a Mac Pro because it doesn't come with a Sound Blaster and you have some app that is Sound Blaster accelerated. That's OK, but you can't expect Apple to throw in every bell and whistle for any random technology in the world that isn't officially supported.
Furthermore, Apple has alienated these developers by staying on OpenCL 1.2. The rest of the world already has OpenCL 2.0 available. So if you really want up-to-date cross-platform technology, CUDA is the only proven option on OS X right now. (And with Metal, the cross-platform gap widens even further.)

"Up to date" is relative. CUDA hasn't changed much in a long time.

Nvidia also only supports OpenCL 1.2 on Windows, so if you're an OpenCL developer, the lack of OpenCL 2.0 is disappointing, but if you can't use it universally on Windows either, it's not that big of a deal.

If Nvidia doesn't have a 2.0 driver at all, that could be one reason Apple hasn't implemented it yet, and maybe another reason they are busy ridding themselves of Nvidia-based Macs. Nvidia really doesn't seem to care about compute.
 
Nvidia also only supports OpenCL 1.2 on Windows, so if you're an OpenCL developer, the lack of OpenCL 2.0 is disappointing, but if you can't use it universally on Windows either, it's not that big of a deal.

If Nvidia doesn't have a 2.0 driver at all, that could be one reason Apple hasn't implemented it yet, and maybe another reason they are busy ridding themselves of Nvidia-based Macs. Nvidia really doesn't seem to care about compute.

Oh, I didn't know this. I simply expected that nVidia would support the latest OpenCL release on Windows and Linux. I never used anything higher than OpenCL 1.2 because the Mac is my primary development platform. I just assumed the grass would be much greener on the other side of the fence. ;-)
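
For what it's worth, you don't have to assume; the driver will tell you. CL_PLATFORM_VERSION reports the highest version the installed platform supports, which, per the above, is 1.2 on both OS X and Nvidia's Windows driver. A quick C check:

    /* Print what each installed OpenCL platform actually supports,
     * e.g. "OpenCL 1.2 ..." */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    int main(void) {
        cl_platform_id platforms[4];
        cl_uint n = 0;
        clGetPlatformIDs(4, platforms, &n);
        for (cl_uint i = 0; i < n; i++) {
            char name[128], ver[128];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof(name), name, NULL);
            clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION,
                              sizeof(ver), ver, NULL);
            printf("%s: %s\n", name, ver);
        }
        return 0;
    }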
 
Oh, I didn't know this. I simply expected that nVidia would support the latest OpenCL release on Windows and Linux. I never used anything higher than OpenCL 1.2 because the Mac is my primary development platform. I just assumed the grass would be much greener on the other side of the fence. ;-)

There's a sense that Nvidia is intentionally dragging their feet because their compute performance is not good, and they're hiding that by locking people in with software. If Nvidia's compute scores don't look good when benched against AMD, they'll drag their feet on any technology that makes that comparison possible. That's one big reason CUDA is so bad for the industry: it lets Nvidia get by with shipping a bad product, because CUDA users are locked into Nvidia's hardware. It's like how Internet Explorer 6 users thought Internet Explorer 6 was better because Microsoft had proprietary APIs that made web sites work only in Internet Explorer. CUDA is basically the Internet Explorer 6 of the compute world.
 
No sweat, I'm curious to see how it really performs.

It seems NVidia wants Pascal to come out ahead of Polaris, let the war begin...
 
There's a sense that Nvidia is intentionally dragging their feet because their compute performance is not good, and they're hiding that by locking people in with software. If Nvidia's compute scores don't look good when benched against AMD, they'll drag their feet on any technology that makes that comparison possible. That's one big reason CUDA is so bad for the industry: it lets Nvidia get by with shipping a bad product, because CUDA users are locked into Nvidia's hardware. It's like how Internet Explorer 6 users thought Internet Explorer 6 was better because Microsoft had proprietary APIs that made web sites work only in Internet Explorer. CUDA is basically the Internet Explorer 6 of the compute world.

My "baloney" alarm just went into full "Oscar Meyer" mode.

Nvidia is "shipping bad product"? Why is it only you (and a couple of AMD shills) that post this? Nobody else.

The internet isn't crawling with complaints about Nvidia; there are far more about AMD. Though now that winter is here, their "Space Heater" line is a little more popular. (That might not be the name anymore; they rebadge so often you need a program to keep track.)

You guys all keep mentioning a rosy future where AMD creates a new World Order via asynchronous compute and magical fairy dust. For those of us living in the here and now, Nvidia still makes the better product.

It will be GREAT if AMD becomes competitive again. But with all these promises of incredible leaps and bounds just around the corner, it's like SOMEONE wants to knock the Nvidia building over, shows up with a truck labeled "World's Most Powerful Bulldozer", and when the truck is opened there's just a glossy press release with "Coming Soon" hastily scribbled at the bottom.
 
The internet isn't crawling with complaints about Nvidia; there are far more about AMD. Though now that winter is here, their "Space Heater" line is a little more popular. (That might not be the name anymore; they rebadge so often you need a program to keep track.)

The internet isn't crawling with complaints because people don't know better. If you can't run CUDA applications on anything else, Nvidia could ship absolutely anything for compute, and the Nvidia crowd would be singing its praises even if it were complete garbage, because you can't run CUDA on AMD gear.

The whole reason the CUDA crowd can pretend they have better performance is that Nvidia has created a situation where no one can bench against them. That's why when you bench OpenCL on Nvidia hardware, they get creamed. They're shipping an inferior compute product, but they're locking the market into that inferior product with CUDA, creating a situation in which they can't actually be benched against AMD.
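
And benching OpenCL identically on both vendors' cards is trivial, which is exactly the point: create the queue with profiling enabled and read the event timestamps. A sketch, assuming the context, device, and kernel have already been set up:

    /* Vendor-neutral kernel timing: runs unchanged on AMD and Nvidia,
     * the apples-to-apples comparison a CUDA-only codebase can't do.
     * Assumes ctx, dev, and kernel were created earlier. */
    cl_command_queue q = clCreateCommandQueue(ctx, dev,
                                              CL_QUEUE_PROFILING_ENABLE, NULL);
    cl_event ev;
    size_t global = 1 << 20;
    clEnqueueNDRangeKernel(q, kernel, 1, NULL, &global, NULL, 0, NULL, &ev);
    clWaitForEvents(1, &ev);

    cl_ulong start, end; /* nanoseconds */
    clGetEventProfilingInfo(ev, CL_PROFILING_COMMAND_START,
                            sizeof(start), &start, NULL);
    clGetEventProfilingInfo(ev, CL_PROFILING_COMMAND_END,
                            sizeof(end), &end, NULL);
    printf("kernel time: %.3f ms\n", (end - start) / 1e6);
    clReleaseEvent(ev);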

It doesn't matter if Nvidia is actually shipping worse hardware. The people in this thread will still buy Nvidia because they're locked into CUDA, and they'll be saying "please" and "thank you" the whole way.

It's going to look really bad for Nvidia if AMD's CUDA -> OpenCL converter ends up producing faster output on AMD gear, and I think that's actually likely to happen.
 
This is what will happen when AMD aims at Nvidia. For some strange reason AMD users think Nvidia is not moving forward!
Houston, we missed the Nvidia target but hit the foreman's personal vehicle! :(
[Image attachment: AMD.png]
 
If ifs and buts were candy and nuts...

Oh c'mon.

Here's the inconvenient truth: CUDA and OpenCL bench about the same on Nvidia hardware. Depending on the algorithm, one might be slightly faster than the other, but in aggregate they're the same speed. There isn't any secret sauce in CUDA that makes it run faster.
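
If you doubt that, put the two side by side. The same SAXPY kernel in both dialects is a near one-to-one transliteration; there's nothing the CUDA version expresses that the OpenCL one can't:

    /* CUDA version:
     *   __global__ void saxpy(int n, float a, const float *x, float *y) {
     *       int i = blockIdx.x * blockDim.x + threadIdx.x;
     *       if (i < n) y[i] = a * x[i] + y[i];
     *   }
     *
     * OpenCL C version; get_global_id(0) replaces the block/thread math:
     */
    __kernel void saxpy(int n, float a,
                        __global const float *x, __global float *y) {
        int i = get_global_id(0);
        if (i < n) y[i] = a * x[i] + y[i];
    }

Any performance difference comes from the driver and the hardware, not the language.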

So if there isn't anything special about CUDA that makes it run faster than OpenCL, and AMD is beating Nvidia on OpenCL performance, what would that mean about AMD and Nvidia's performance in general?

The only reason Nvidia is continuing on the CUDA path instead of backing OpenCL is that they wouldn't look good on a level playing field.
 
Add to that HSA 2.0, which on Nvidia hardware is done through... CUDA.

Everything for Nvidia comes from software. There is no hardware scheduling, because the GPUs have only the MegaThread engine; everything depends on software. Nvidia got rid of hardware scheduling because Fermi was hot and inefficient, precisely because it had a hardware scheduler. But let's look at the world of low-level access to the GPU (gaming, VR, future pro applications). What happens when you have to access the GPU that way and there is nothing at the hardware level that knows what to do with the application? You have to code the WHOLE application for specific hardware. Today Nvidia supports devs with their knowledge, engineering, and drivers. That is all their power. Unfortunately for Nvidia, things are starting to turn.

Many of you think that I am an AMD fan. I have always criticized Nvidia hardware for the same reason you criticized the Mac Pro: for not being future-proof.
 
Apple doesn't have a "kill CUDA" mentality; Apple has a "pay no more than this much to make a computer" mentality. AMD are in the range now because they're willing to do custom form factors for less (in the nMP's case, two grand per machine less) than nVidia. That is the beginning, middle, and end of the story.

There is every reason in the world to think that Apple is thinking about more than just price, at least short-term price. Apple doesn't want to live in a world where only Intel and nVidia exist as parts suppliers. Even if Apple wants to use nVidia and Intel hardware, they want those companies to have competition to keep prices lower and performance gains going, long-term. Apple had enough trouble in the PPC days and they are not interested in getting into that same situation again.

I also think Apple very much does want to kill CUDA. They'd much prefer OpenCL be king so that they and their developers can make portable code. Apple doesn't want to get locked into CUDA any more than their customers do. If you've already gotten yourself locked into CUDA, I sympathize, but you should be pressuring your software suppliers to port over to OpenCL ASAP.

So, if Apple can send some money AMD's way to keep them alive, and help kill off CUDA in the process, that's win-win for Apple, and a win for Apple's customers too. And why do you think AMD is so willing to do custom form factors for less, anyway? AMD is grateful for Apple's support.
 
I wouldn't say all developers have abandoned OS X.

Blizzard Entertainment is building the new version of World of Warcraft using Metal.

And yet for the first time in Blizzard's history they're releasing a new IP that doesn't support OS X. Not exactly comforting knowledge.
 