
SoyCapitan

Suspended
Original poster
Jul 4, 2015
4,487
2,551
Paris
Nvidia has finally mentioned that it has included beta support for Maxwell in its newest drivers. I've been trying to tell people for a while that Maxwell support wasn't included in the drivers until now, and that our dear 9-series cards have been running on Nvidia's Unified Driver Architecture (UDA), which is forward compatible, especially for reference-design graphics cards. Now Nvidia has confirmed it in their latest blog post, as well as promising big performance gains henceforth. Enjoy, but bear in mind that at this stage the performance benefits are mostly for Kepler GPUs. Maxwell should see Windows-like gains later.


http://blogs.nvidia.com/blog/2015/08/31/mac-driver/

'Our new driver for the Mac Pro offers up to 80 percent improved performance for Macs with Kepler GPUs. And, for the first time, our driver includes beta support for MacBook Pros and iMacs with Kepler GPUs, as well as beta support for those using Maxwell GPUs in older Mac Pro systems.'

In related news, the current El Cap beta driver continues to work in Public Beta 6. Thanks to @flowrider for being first to test.
 
I'm not sure that is what that blog post is stating.

It is referencing the driver for 10.10.3, several releases ago.

The bits about beta support for iMacs and MBPs are, I think, a reference to the fact that the driver no longer runs a machine-ID check to install only on the MP 3,1/4,1/5,1.

There is some undisclosed political maneuvering going on, and the "read between the lines" is up to interpretation.
 
They reference the driver released on 18 August.

I'm sure there is always some manoeuvring going on backstage to win Apple back. We'll see how AMD compares once the Nvidia drivers mature and exit the beta stage for Maxwell.

Did you have any luck getting the Fury to finally boot into OSX?
 
Yes, it's the Web Driver released on 8/18/15 for Yosemite 10.10.5, and it is specific to that point release. It is well known that it supports both Kepler and Maxwell cards. Folks have already said that driver is faster than previous ones. And AFAIK, the 346.02.03f01 Web Driver is not a beta; it is a final release. The beta Web Driver for El Cap has not yet been updated to the level of the latest Yosemite Web Driver.

Lou
 
Really interesting how Nvidia publicly acknowledges Maxwell yet there aren't any "official" Maxwell GPUs for Mac - yet they say sure, go ahead and add a Maxwell card to an older Mac Pro. Tacit acknowledgement of our little scene, and egg on Apple's face for doubling down on AMD GPUs? Looks like Nvidia gave up on trying to make a deal with Apple and now are resorting to shaming them. Hope it works.
 
^^^^Yep, you're exactly right - I guess I really didn't look at that aspect, but you are in fact RIGHT ON:p

Lou
 
That would be funny but there are probably only a couple of thousand cMP machines in the world with Maxwells added. But we are a vocal bunch and as our debates and benchmarks demonstrate we really do not want second best. The beta drivers could signal that Maxwell GPUs are going to be forthcoming, but possibly only in the iMac.

Nvidia and Apple are now competing in the mobile, television set top box and automotive market so there will naturally be politics involved.
 
The Radeon 370X has been released (in Asia) as a budget card to compete against the GTX950. I wonder who will be the first to try it.
 
Small but vocal is right. It can't escape Apple's notice that modern Nvidia GPUs in cMPs are pummeling their AMD counterparts in tests. Would explain why Nvidia is even bothering to keep updating the drivers for such a small userbase.
 
No on both counts. Apple couldn't care less whose chips are best. Long ago, its analysts will have figured out that AMD and Nvidia are close enough in everything but the very top end. And, until Fiji, AMD's much smaller die size has meant it offered the best cost/performance ratio.

The only reason that Nvidia keeps updating its drivers is in the hope that they beat AMD to the next supply contract. To be in with a chance, they need to show Apple that there are drivers ready to go.
 
I agree, Apple couldn't really care less which GPU is at the top. I pretty much can guarantee AMD is offering bargain basement prices for Apple to keep using their parts, the embedded business is nearly all AMD has at this point.
 
GM107 (i.e. the GTX 750 Ti) is way better than the GPU in the new rMBP though. GM204 (GTX 970, GTX 980) is way better than the GPU in the riMac. It's not just the high-end where NVIDIA is ahead, they are clobbering AMD in perf/watt across the entire spectrum.
 
If I still have the cMP in two years I hope there will be a Pascal that fits below 225w. I don't need an upgrade anytime soon as the new Mac drivers, Metal and DX12 will be good.
 
Given that the GTX 980 was 165W, I think there will absolutely be a Pascal GPU at a sub-225W TDP. Pretty sure NVIDIA learned their lesson from the Fermi architecture.
 
Agreed. If Apple wanted the "best" GPUs in their products they would have switched back to nVidia years ago, because nVidia has absolutely clobbered ATI in every metric, at every point along the performance continuum, for quite a while. If ATI's cards get any worse, they will have to pay Apple to use their crap just so they can move product and claim the sales volume.
 
The "problem" is that AMD is cheaper across the entire spectrum, and that's the only thing Apple cares about.

That's especially sad when you look at the machines Apple sells and realize that they need low TDP cards for their tiny shiny cases, but they prefer to roast their hardware to save a few $$. In my hackintosh I don't care about TDP at all, so I'll just go for the cheapest option, but Apple should.
 

My MacBook would shut off at that temp.

Most computers would.

With the old GTX470/480 cards I used to see GPU temps of 90C and I was concerned about longevity.

Apple and AMD are running that 5K with a part radiating 105C, with a very pricey screen just mm away. I'll bet $20 that these iMacs will eventually have brown/yellow areas over the GPU (if the logic board lives long enough).

I have stated it before: the presence of these 2nd-rate GPUs proves that Apple isn't the least bit interested in making the BEST-performing computers, just the computers that create the image of solidity and are the most profitable.
 
Yep. I seriously do not know what I will do when I finally reach a point where a cMP doesn't "cut it" anymore for me but there is no desirable Apple computer left to buy. Not interested in an iMac whose video card will cook the screen, and not interested in a trash can workstation full of proprietary hardware that cannot be upgraded. I could end up an iOS-only Apple user plugging my iPhone, iPad, etc. into a Windows box. Hard to imagine when I have owned Apple computers since the early 90s. I still think back to my beloved Pismo PowerBook - favorite Mac I have ever owned - and realize how far Apple's priorities have shifted away from upgradeability.
 
So many of us are in this exact same boat. Apple doesn't seem to want us any more.
 