Had a brief scare while updating from the beta drivers to the final release.
While the Mac was rebooting after the install, its power suddenly clicked off like a light, right in the middle of the loading bar.
Immediate gut reaction: "Great, the update just killed my computer!"
Luckily, when I turned it back on, all was fine. What the heck was that about?
 
To summarize and avoid any misunderstanding: those who have MVC-flashed cards running 10.10.4 can update to 10.10.5 straight away and then update the Nvidia web driver, right?
Right, if you are on 10.10.4 and use the App Store to update, or the delta update from Apple's website.
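If you prefer Terminal for the delta route, softwareupdate can do the same job. A minimal sketch (the update label below is illustrative; list first to get the real one):

Code:
softwareupdate --list                          # shows available updates and their labels
sudo softwareupdate --install "OSXUpd10.10.5"  # example label; use the exact one from --list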
 
Job done successfully with my Titan X.
I just had to restart the Mac Pro manually as it got stuck instead of restarting automatically, but that's OK, not a big deal.
 
No sign of a Quadro M6000 for Mac release yet, so I don't know if Maxwell will come to the Mac. We're lucky Nvidia uses a unified driver code base that allows Maxwell to run on a Kepler driver. If we had native support I'm sure the benchmarks would be closer to Windows. Especially missing is the real-time data compression that allows a 256-bit bus on Maxwell to perform as well as a 384-bit bus on Kepler, AMD, etc.
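Rough numbers to show why that compression matters (the ~30% saving is Nvidia's own ballpark figure for delta colour compression, so treat this as an estimate):

Code:
256-bit bus x 7 Gbps GDDR5 / 8 = 224 GB/s raw    (GTX 980, Maxwell)
384-bit bus x 7 Gbps GDDR5 / 8 = 336 GB/s raw    (GTX 780 Ti, Kepler)
224 GB/s x ~1.3 effective gain  = ~291 GB/s, close to the Kepler card's raw figure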
 
This should not require a 'source' if you have tracked Nvidia driver development since the old days and remember when UDA was announced.

http://www.nvidia.co.uk/object/feature_uda.html

The compression and caching features of Maxwell do not exist in Kepler, and Nvidia says nothing about any form of Maxwell support in these drivers. The GTX 980/970/Ti/Titan X only work because of the Unified Driver Architecture (UDA) in the Mac web drivers. Every time a new-generation Nvidia chip comes out it is UDA-compatible (unless the device IDs have been removed). On top of the UDA sit the specific software drivers for new features, such as the compression algorithms Maxwell introduced, which won't be in the web drivers; judging from the benchmarks, it's clear they aren't there. Barefeats shows how similarly Kepler and Maxwell run on a Mac; it's only the clock speed making a little difference. If we had true Maxwell support you would see much better results. Before anyone asks for 'proof' that Maxwell code isn't there, they should prove it is there first.
 
This should not require a 'source' if you have tracked Nvidia driver development since the old days and remember when UDA was announced.

I know what the hardware feature is; I'm asking for your source that this feature isn't enabled in the OS X drivers. It sounds like you're just assuming it's not enabled based on performance results?
 
I know what the hardware feature is; I'm asking for your source that this feature isn't enabled in the OS X drivers. It sounds like you're just assuming it's not enabled based on performance results?
It's not enabled because the driver doesn't contain any Maxwell code. If it did, the Quadro M series for Mac would have been released months ago. Again, if people are going to assume there are specific Maxwell optimisations in the drivers, they need to prove it instead of asking others to 'prove there isn't'. Our cards work because of UDA only. Yes, the benchmarks also show that there are no Maxwell optimisations, just higher clock speeds.
 
It's not enabled because the driver doesn't contain any Maxwell code.

If the OS X drivers didn't include any Maxwell code, no Maxwell GPU would work at all. You're assuming a lot of knowledge about the internals of the OS X driver codebase compared with other OSes. The OS X drivers aren't the full UDA used for Windows OpenGL, due to the different driver model Apple uses (i.e. much of the driver stack is implemented in the Apple OpenGL framework, and the hardware drivers implement the bottom half). I get that 980 Ti performance isn't as good as it could be, but I'd phrase it as: the Windows D3D driver has more optimizations than the OS X OpenGL driver for these cards.

I also highly doubt there will ever be an official Mac Edition card; why would they target a Mac Pro from five years ago?
 
If the OS X drivers didn't include any Maxwell code, no Maxwell GPU would work at all.

You ignored UDA after it was explained to you. It even says 'forward compatibility' on the link I provided. They could possibly add the device ID, and that's all that would be needed to get a card to run, but it wouldn't be optimised. If you now say the 'full UDA' isn't implemented, that's wrong too: it is implemented, which is what allows everything from the GT 120 to the Titan X (eight generations, if we don't include the GeForce 8 series) to run on the same base. Since you assume the 'full UDA' isn't implemented but Maxwell code is, you are taking two positions without any evidence. I took a look at that long Nvidia driver thread and UDA isn't mentioned once, because until I came along nobody appears to have heard of it. So I don't want to debate further. Take the information or leave it.
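If anyone wants to check for themselves, the device-ID matching is visible in the web driver's kexts. A rough sketch (the kext name and plist key here are from memory and may differ between driver releases):

Code:
# lists the PCI device IDs the Maxwell kext claims to support
grep -A 2 IOPCIPrimaryMatch /System/Library/Extensions/NVDAGM100HalWeb.kext/Contents/Info.plist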
 
You ignored UDA after it was explained to you.

Actually, I didn't. There's a reason NVIDIA releases a new driver along with each new GPU: it has to update the driver code to support it. The UDA means they deliver a single binary that supports all currently supported GPUs along with the new one; that is, the same driver bits will run on a GTX 680 and a GTX 980 Ti. It doesn't mean the GTX 680 code magically makes the 980 Ti work, just that there's a single driver download that works on both cards (and all the others currently supported). Over time, older GPU architectures get phased out and new ones keep getting added. UDA means there's just a single driver, not a different binary for every specific GPU (i.e. the 680 driver being a different download from the 980 Ti driver, like some other vendors had/have).

My basic point is that you're assuming a lot of internal knowledge about how the OS X driver from NVIDIA works and how much is shared with other OSes and other APIs like DirectX. The latest driver for OS X improved performance on Maxwell cards by a lot; hopefully those improvements will keep coming over time. If you want to blame the lower-than-Windows performance on the lack of Maxwell compression, we can agree to disagree.
 
I think we would see greater improvements in Nvidia performance if they didn't have to fight Apple.

If Apple just wanted their computers to have the best performance instead of being political about it, the drivers would be in the OS, and the time Nvidia spends keeping two separate drivers co-existing in OS X could be used to make one driver sing.

I'm also amazed that they still bother. Apple has been making them sleep on the front porch for a while now.
 
I'm sure financial considerations also play a considerable role in Apple's choice of GPUs. In a perfect world, Apple would make GPUs from both vendors available on all models of Macs.

Or at least adopt the current iterations of nVidia hardware, since they vastly outclass ATI's trash.
 
I'm sure financial considerations also play a considerable role in Apple's choice of GPUs. In a perfect world, Apple would make GPUs from both vendors available on all models of Macs.

This is the correct answer. There's a reason you keep seeing ATI/AMD graphics in things like new Macs, consoles, etc. ATI is providing their second-rate GPUs at bottom-dollar prices to try to keep their name relevant and hang on to the embedded niche of the market. nVidia doesn't have to play this game, because anyone who actually cares about performance and thermals will just go buy an nVidia GPU themselves.

The only way many end users consider ATI chips these days is with a "Hmm, I'm saving so much money with all of these rebates - I guess I can put up with the performance."
 
After installing, I can't switch to the web drivers. I go to the Driver Manager preferences, click the lock, put in my password, and switch to the web drivers. It prompts a restart, after which it's back to the native OS X drivers. What's going on?
I'm on 10.10.5 running a Quadro 4000 in a Mac Pro 4,1.

Thanks!
 
After installing, I can't switch to the web drivers.

Manually add nvda_drv=1 to your boot-args using Terminal.
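For example (note this overwrites any existing boot-args, so include other flags you already use), then reboot and confirm the web driver actually loaded:

Code:
sudo nvram boot-args="nvda_drv=1"   # overwrites existing boot-args
nvram boot-args                     # confirm the flag is set
kextstat | grep -i nvda             # after reboot, web driver kexts typically appear as com.nvidia.web.*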
 
^^^^ Since the 2012 iMac came with Nvidia graphics, I don't see why not. Download and install it, run a couple of graphics tests (Heaven or Valley) and see what the results are. I doubt there'll be much, if any, difference. I would, however, download and install the CUDA driver.

Lou
 
I doubt there'll be much, if any, difference.

Can you help me understand why you're so confident?

Oh, and why are the CUDA drivers offered separately??

Lastly, to answer why I'm looking into doing it at all: I read on a forum that someone's gameplay performance in Metro Redux improved significantly after doing so.

As both games are literally unplayable for me now, I'm desperate. :)
 