
Flint Ironstag
Just caught this on Gizmodo. The next CUDA release will be the final one (famous last words!) to support macOS.

https://gizmodo.com/apple-and-nvidia-are-over-1840015246

I am sad to see this, as it's much easier to cool three Nvidia GPUs in a tower than three AMD cards - they throttle under my workloads (among other issues). Guess we'll see what their next generation of compute cards looks like. And Intel should have something in 2020...
 
Well, not necessarily. NVIDIA could cave and choose to only support Metal on macOS. So it’s still a possibility. Doubtful, but possible. It’s a dang shame.
 
Well, not necessarily. NVIDIA could cave and choose to only support Metal on macOS. So it’s still a possibility. Doubtful, but possible. It’s a dang shame.

There never was CUDA support on iOS, iPadOS, or tvOS devices, so what other Apple platform did Nvidia support besides macOS? If Nvidia says they are dropping "Apple" support, pragmatically that means macOS, because that was the only place they were.

The quote from Nvidia (from the Gizmodo article):
"CUDA 10.2 (Toolkit and NVIDIA driver) is the last release to support macOS for developing and running CUDA applications."

So it isn't a "not Apple (but maybe macOS)" thing; macOS got named explicitly.


What is more likely is that Nvidia didn't want to support Metal on macOS in the manner Apple has laid out. (Metal couldn't be some second-class citizen that Nvidia prioritizes behind its CUDA objectives.) That is really the core of the impasse. Apple controls the kernel and driver signatures, so it is their "house" and "house rules." Similarly, the kernel security policies and driver structure are about to change. If Nvidia isn't going to align with that, the relationship is basically over anyway. (Non-compliant stuff won't get a signature.)


The notion that Nvidia would be "supporting" macOS via drivers that require hacks to install (or that are even more brittle than the "halt and catch fire in the presence of a new macOS version" drivers they were drifting toward over the last couple of years) doesn't hold up. "It happens to work with these hacks" isn't "support" in the normal industry sense of the word. Nvidia going 100% rogue driver developer would mean zero chance at getting a signature and no joint bug-resolution support.
 
People are way too harsh on AMD cards on this site. I had a 2010 Mac Pro that came with the AMD card. I put in a GTX 980 and got far worse performance, even though the card was better on paper. Not everyone needs CUDA. Not everyone buys these systems for gaming. These are not marketed as CUDA systems or gaming computers. People need to stop acting as if Apple is saying you can play the next generation of Crysis games at max settings in 8K resolution on AMD cards.

You get the system you need to perform your task. You need CUDA? Get Windows and save a lot of money. Need a free NAS setup? Get FreeNAS (FreeBSD-based) or unRAID (Linux-based). Need to use Final Cut Pro/Logic Pro/GarageBand? Get a Mac. It's that simple. I use all three operating systems and multiple computers, each with its own purpose.
 
There never was CUDA support on iOS, iPadOS, or tvOS devices, so what other Apple platform did Nvidia support besides macOS? If Nvidia says they are dropping "Apple" support, pragmatically that means macOS, because that was the only place they were.
You know what CUDA is, right? That's the only thing that NVIDIA said they'd stop supporting. So it either means 1) They're dropping support for NVIDIA cards altogether in macOS, or 2) They're moving to Metal for their cards on macOS
We shouldn't jump to conclusions. But unfortunately, yeah, it might be true that they're just dropping Apple support altogether. :( We can speculate that it's true, but shouldn't jump to the conclusion yet.
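For anyone following along who hasn't touched CUDA: at the app level it's just C++ with GPU extensions, compiled by NVIDIA's own toolchain. Here's a toy sketch (hypothetical code, nothing from the announcement) of the kind of thing the 10.2 toolkit is the last version able to build and run on macOS:

[CODE]
// Toy CUDA vector add: the kind of code the macOS toolkit can no
// longer build or run after the 10.2 release.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    // Unified memory keeps the sketch short; visible to both CPU and GPU.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);  // launch on the GPU
    cudaDeviceSynchronize();                       // wait for completion

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
[/CODE]

None of that is exotic, but every line of it depends on NVIDIA's compiler and driver, which is exactly what's going away.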
 
CUDA 10.2 (Toolkit and NVIDIA driver) is the last release to support macOS for developing and running CUDA applications. Support for macOS will not be available starting with the next release of CUDA.
 
I thought the title read:
no more Apple + Color. The color correction app that was once in the top three best. Apple bought the company to compete with other grading apps and then blew it off!
no more Apple + Shake. The compositing app that was once in the top three best. Nothing Real is the company Apple bought to compete with Nuke by The Foundry. Apple blew it off by trying to base a new nodal version of Shake, to be called Phenomenon, on the layer-based Motion. Apple was laughed out of that forum! The people from Shake left Apple and went to Nuke.
Yes, I have written this before, but the pattern is now more obvious. There is one company, though, that Apple seems to be able to control and has on a short leash!
[attached image: Short Leash.png]
 
You know what CUDA is, right?

There are two aspects to CUDA. One is the computational tool. The other is that Nvidia often uses it as a hook to build a bigger moat around their hardware. Sections 3.2 and 3.3 of their programming guide outline interoperability with OpenGL, DirectX/Direct3D (up through 12), and Vulkan, but not Metal, which has been out for as long as Vulkan. Nvidia has put a lot of effort into telling folks that they can sometimes skip GLSL, Microsoft's compute APIs, and OpenCL and get the best performance by way of CUDA.
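To make that hook concrete, here is a rough sketch of the GL interop path those sections describe: CUDA registers and maps a GL buffer and writes into it directly, skipping the shader-language compute paths entirely (toy code; assumes a GL buffer handle vbo created elsewhere, and omits error checks):

[CODE]
// Sketch of CUDA/OpenGL interop: a CUDA kernel writes straight into a
// GL vertex buffer, bypassing GLSL/OpenCL compute paths.
#include <cuda_runtime.h>
#include <cuda_gl_interop.h>

void fillBufferWithCuda(unsigned int vbo /* GLuint created elsewhere */) {
    cudaGraphicsResource* res = nullptr;

    // Register the GL buffer so CUDA can see it.
    cudaGraphicsGLRegisterBuffer(&res, vbo, cudaGraphicsMapFlagsWriteDiscard);

    // Map it and obtain a device pointer CUDA kernels can write to.
    cudaGraphicsMapResources(1, &res, 0);
    float* dPtr = nullptr;
    size_t bytes = 0;
    cudaGraphicsResourceGetMappedPointer((void**)&dPtr, &bytes, res);

    // ... launch a kernel on dPtr here ...

    // Unmap so OpenGL can use the buffer again.
    cudaGraphicsUnmapResources(1, &res, 0);
    cudaGraphicsUnregisterResource(res);
}
[/CODE]

The toolkit ships equivalent call families for Direct3D and Vulkan; nothing comparable for Metal.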

There are also two levels to CUDA. One is the library and app-level language layer (the CUDA Runtime). However, there is also a low-level aspect (the CUDA Driver API). The latter is where, pragmatically, GLSL, OpenCL, or other computational code is going to come in and interact. The "closer to general purpose" computation code and the shader code all share the same base-level resources, so there are some common control and compile elements there. CUDA's Runtime model is broader and more general than Metal (which started out more focused on being a shader-language replacement).
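Roughly, side by side (toy sketch; error handling omitted):

[CODE]
#include <cuda_runtime.h>  // high level: CUDA Runtime API, cuda*() calls
#include <cuda.h>          // low level: CUDA Driver API, cu*() calls

void runtimeLevel() {
    // Runtime API: context management is implicit.
    float* d = nullptr;
    cudaSetDevice(0);
    cudaMalloc(&d, 1024);
    cudaFree(d);
}

void driverLevel() {
    // Driver API: everything is explicit. This is the layer where the
    // shared base-level resources (contexts, memory) actually live.
    cuInit(0);
    CUdevice dev;
    cuDeviceGet(&dev, 0);
    CUcontext ctx;
    cuCtxCreate(&ctx, 0, dev);
    CUdeviceptr d;
    cuMemAlloc(&d, 1024);
    cuMemFree(d);
    cuCtxDestroy(ctx);
}
[/CODE]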

It isn't a completely clean silo with zero interactions with other parts of the graphics stack.

That's the only thing that NVIDIA said they'd stop supporting. So it either means 1) They're dropping support for NVIDIA cards altogether in macOS,

Pragmatically, 1) is already true. How many releases have they done on Mojave? On Catalina? It's been over 12 months since Mojave launched, and nothing. How is that active support status? At the point of being two OS versions back, that is pretty much dropping the ball on support.


or 2) They're moving to Metal for their cards on macOS

They haven't moved at all in terms of the contemporary versions of macOS. That would more so be a restart, because they haven't been moving at all for a long while now. [It is akin to a Windows driver developer saying "Oh, guess we should start moving to supporting DirectX," as if that were an option for a committed, dutiful partner of Windows development.]

I suppose it is possible that Nvidia would leave their protective moat behind. The amount of money they throw into constructing it on multiple platforms, though, makes that unlikely. But yeah, it isn't completely impossible.


We shouldn't jump to conclusions. But unfortunately, yeah, it might be true that they're just dropping Apple support altogether. :( We can speculate that it's true, but shouldn't jump to the conclusion yet.

An observation (that they aren't in a support status now) isn't really a conclusion (as in some long chain of inference or speculation). This new CUDA release only targets old versions of macOS (which are at best in major-security-fix-only status now). From an Nvidia-first view of the world, that is one last round of macOS "support." From a macOS perspective, though, that is pretty far off.
 
I really don't know the source of the rift between Apple and nVidia; I just know it's bad for end users not to have choice.

You would think, with the new mMP giving us PCI-E slots again, Apple would cave and allow end users choice. nVidia must have pissed someone at Apple off more than Trump at an immigration rally.
There's always hope on the Intel front.
 
There's always hope on the Intel front.

In the Mac Pro context, no short-term help. From Intel's announcements so far and some leaks, it appears they are taking a two-pronged rollout approach that makes lots of sense for them but probably won't help most of the upper end of Apple's desktop lineup.

There are multiple micro-architectures they are pursuing under one overall architecture label of Xe: Xe-LP ("low power"), Xe-HP ("high power"), and Xe-HPC ("high performance computing"). Intel is working from the outside to the middle of that grouping. Xe-LP should arrive in 2020; Xe-HPC well into the second half of 2021.

So in terms of Mac desktop systems, the Mini and the 21.5" iMac are the more likely intermediate-term candidates. But yeah, in 2022 a version scaled down to two Xe-HPC packages could be a candidate for a Mac Pro.

There may be a short-term gap, but if Intel doesn't screw up the implementation, there should be two possible vendors. (And Intel will probably be 'hungry' for more sales, so they'll get with the Metal program.) The new instruction set is grounded in the iGPU one they have now, so they probably already have a partially working Metal implementation for the first test dies from the fab.
 
  • Like
Reactions: Flint Ironstag
While I know folks who kinda depend on CUDA, I have no current functional need. BUT I do have a GTX 980, which is a stellar performer for what I need. Yes, I know I am also stuck at High Sierra for macOS, but I am at peace with that (i.e., it suits what I currently do). HOWEVER, I also depend on nVidia "web drivers" that seem to need an update for every minor security update the OS gets (they seem tied to the OS build number). My bet is there will be a few more HS "security updates," so naturally this is of concern. Anyone have a clue what nVidia will do about this?
 
Is there a reason Nvidia should bother with Apple compatibility? I understand that for those with pre-2013 Mac Pros, this can be a major thing. For those who may buy the new Mac Pro or use an eGPU, it can still be a thing. But as important as it is to those users, they're still an edge case within Apple's ~10% share of the PC market (an edge case within an edge case).

It doesn't come down to the sale of a relative handful of high-end GPUs; it comes down to whether their lower-end GPUs are present in mass-market Macs, which they are not. There's little point in rehashing why that is.

The fact is, it's been that way for a decade in iMacs (the last Nvidia-equipped iMac was Late 2009). The last Mini was Mid 2010, the last factory-equipped Mac Pro Early 2009, MacBook Pro Mid 2010, MacBook Air Late 2010, and the last MacBook Mid 2010. All of these are on Apple's Vintage/Obsolete list (and nearly all Obsolete).

The highest version of macOS officially supported on any of those machines is 10.13.6 (even lower for the Mac Pro). Why should Nvidia bother to produce drivers for OS versions that aren't supported on the machines containing their GPUs?
 
Please at least get the facts straight for anyone who is "new" to this.

Fermi and Kepler Apple-issued NVIDIA GPUs (like those in MacBook Pros) and officially released GPUs for the Mac Pro (like the GTX 680 Mac Edition) are fully supported BY APPLE and work with APPLE'S OWN DRIVERS that are part of macOS Mojave and Catalina.

These drivers are NVDAGF100Hal.kext and NVDAGK100Hal.kext, and they are present on every machine with macOS installed.
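They're ordinary bundles on disk, so you can verify this on your own machine. A quick sketch, assuming the stock /System/Library/Extensions install paths:

[CODE]
// Quick check that Apple's bundled NVIDIA Fermi/Kepler kexts exist.
// Assumes the stock /System/Library/Extensions install locations.
#include <filesystem>
#include <cstdio>

int main() {
    const char* kexts[] = {
        "/System/Library/Extensions/NVDAGF100Hal.kext",  // Fermi
        "/System/Library/Extensions/NVDAGK100Hal.kext",  // Kepler
    };
    for (const char* k : kexts)
        std::printf("%s: %s\n", k,
                    std::filesystem::exists(k) ? "present" : "missing");
    return 0;
}
[/CODE]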

NVIDIA Web Drivers added support for additional GPUs (not issued by Apple) through the Pascal generation (and/or Volta if you use the pulled .108 driver).

This latest announcement impacts CUDA on macOS only, which hasn't worked with anything beyond High Sierra anyway. High Sierra will likely stop receiving frequent security updates in late 2020. The processors in the MP5,1 and earlier are already EOL'd by Intel. If you're relying on NVIDIA and CUDA on macOS, it's getting closer and closer to time to set those machines up for legacy support.
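If an app you maintain still ships an optional CUDA path on macOS, the pragmatic move now is to gate it at runtime and fall back otherwise. A sketch using the stock version/device queries (the runtime encodes versions as 1000*major + 10*minor, so 10.2 reports 10020):

[CODE]
// Gate an optional CUDA path: on macOS nothing newer than 10.2 will
// ever ship, so treat CUDA as best-effort and fall back when absent.
#include <cuda_runtime.h>
#include <cstdio>

bool cudaUsable() {
    int ver = 0, count = 0;
    if (cudaRuntimeGetVersion(&ver) != cudaSuccess) return false;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) return false;
    std::printf("CUDA runtime %d.%d, %d device(s)\n",
                ver / 1000, (ver % 1000) / 10, count);
    return true;  // caller falls back to the CPU/Metal path when false
}
[/CODE]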
 
NVIDIA GeForce GT 750M in MacBookPro11,3 (MacBook Pro Retina, 15-inch, Late 2013) and this machine is officially supported for Catalina by Apple without "hacks" involved.
Yes, but this is Kepler without web drivers or CUDA toolkits in Mojave and Catalina. All Kepler cards work in Catalina with the Apple drivers. I think what Nvidia means is that there will be no further development/support for CUDA, which we have already known since Mojave. I don't see any news here, do you?
 
While I know folks who kinda depend on CUDA, I have no current functional need. BUT I do have a GTX 980, which is a stellar performer for what I need. Yes, I know I am also stuck at High Sierra for macOS, but I am at peace with that (i.e., it suits what I currently do). HOWEVER, I also depend on nVidia "web drivers" that seem to need an update for every minor security update the OS gets (they seem tied to the OS build number). My bet is there will be a few more HS "security updates," so naturally this is of concern. Anyone have a clue what nVidia will do about this?
Why not buy an RX 5700 XT, which will run circles around the GTX 980 and will most likely work out of the box on the Mac once we see Macs with Navi GPUs?
 
I don't see any news here, do you?

There is no (new) news here. Basically just the official announcement that CUDA on macOS is dead, or barely clinging to life support. Also the announcement that CUDA (and toolkit) version 10.2 is now available and that it will be the last and final update for macOS (and only if you're on High Sierra).

I'm personally shocked they even released this update. NVIDIA support previously said CUDA 10.1 would be the last version for macOS, but they obviously felt the need to make a public announcement to at least acknowledge the elephant in the room.

If you're on macOS, embrace Metal, move to another platform, or prepare to transition your machine(s) to legacy status. That writing has been on the wall since Mojave was released.
 
Well, not necessarily. NVIDIA could cave and choose to only support Metal on macOS. So it’s still a possibility. Doubtful, but possible. It’s a dang shame.

The (Mac) market simply isn't large enough for the CEO (Huang) to "cave" to anyone. He continues to think he is in the catbird seat, and there are just too many egos involved. Personally, I don't care for NVIDIA, so this suits me just fine. I would rather Apple continue to invest in AMD, and for AMD to continue to challenge Intel and NVIDIA on the CPU and GPU fronts. We already see how Intel is more than happy to abuse its market position, and NVIDIA will do the same without actual competition. AMD is really the only competition these two companies have.

I am mystified that people think AMD CPUs in a Mac are a good thing because Intel does a lot of sneaky crap, and then on the flip side, pine for NVIDIA GPUs, when NVIDIA does a lot of sneaky things as well.

Didn't the original Retina MacBook Pro have an Nvidia card too? The GT 650M.
 
Basically the Mac platform is confirmed dead for many scientific users, particularly in machine learning/AI.

The advantage of CUDA is the existing software base and portability from desktops to datacenter. Metal simply isn't available on servers or on cloud computing providers, so it's not an option.

Windows has nicely taken over, with Microsoft's Windows 10 push for Linux and command line support, plus far better workstation and hardware support.

Apple has decided the only true pro user they care about is video production.
 
Basically the Mac platform is confirmed dead for many scientific users, particularly in machine learning/AI.

The advantage of CUDA is the existing software base and portability from desktops to datacenter. Metal simply isn't available on servers or on cloud computing providers, so it's not an option.

Windows has nicely taken over, with Microsoft's Windows 10 push for Linux and command line support, plus far better workstation and hardware support.

Apple has decided the only true pro user they care about is video production.

CUDA is dying for everything other than hobbyist ML. The big cloud players are all moving to ASICs (e.g., Google's TPU). CUDA was a half-decent interim step, but making a GPU do ML/AI tasks won't compete with purpose-built silicon.
 