
netkas

macrumors 65816
Original poster
Oct 2, 2007
With the recent changes and the discontinuation of the 27-inch non-Retina iMacs, there are no more Macs with an Nvidia chip in them.

Pretty sad news for us cMP owners.

Thanks to Nvidia, though, for providing web drivers.
 
Instead we can now order an M380, which I'm pretty sure is a re-rebadged HD 7770, isn't that great?! :D

And there is no option that I can see to add a discrete GPU to the 21.5" 4K model. Now you have to buy a 27" model to enjoy the privilege of an AMD GPU. And admittedly OT, but I also find it appalling that the base models still ship with a regular mechanical hard drive. Fusion Drive should be the floor by now.
 
This isn't to start a panic or anything, but I do wonder how long nVidia will continue to write drivers for Mac products when Apple sells no nVidia chipset, and although there are a number of cMPs using nVidia cards, they surely must be less than 1% of nVidia's users???

I mean, they are a business and must be paying someone to develop these drivers, and therefore must expect to see a payback.

I do find it sad that Apple has completely moved away from nVidia, as they seem to have the upper hand in performance versus power. Oh well, maybe there will be a Mac Pro-shaped surprise next year with an nVidia chip???
 
Well, most cMP owners aren't poor guys; they usually buy high-end products like the 980 Ti and Titan X and upgrade them once new, more powerful cards come out.

These cards pay for themselves if used to make a living.

These cards might make enough money for Nvidia to keep supporting Macs.
 
If I'm reading this correctly, the R9 M395X is SLOWER than the R9 M295X used in the last-gen Retina iMac? (TBH, once they thermal throttle their tits off they will be the same speed, LOL.) I'm hoping Apple is only using AMD because they're in some sort of contract with AMD, and I hope that if that's the case, once it's over they switch to Nvidia graphics.
 
Apple can get AMD chips for next to nothing because they are good for next to nothing. And since Apple has always kneecapped its computers with old/subpar GPUs for as long as I can recall, it's a "win-win" for Apple and AMD and a loss for us.
 
Well, I think AMD is for the best since they came up with the integrated memory controller Nvidia will be using as well.
 
The HSA Foundation is the ONLY reason Apple would go for AMD, apart from cheaper parts (Fiji and Grenada are the parts AMD is not willing to sell at a bargain price; Lisa Su knows how to do business for cash income, after all). That's why we see the same old GPUs with new names in the "new" iMac.

The problem is that Apple is right now milking its customers and not shipping hardware that would justify the prices of the iMac and Mac Pro.
 
Yeah, I think I'll have to agree to disagree with your analysis. If HSA were the most important thing to Apple, why did they go and invent their own graphics and compute API with Metal? Why haven't they released an OpenCL 2.0 implementation with its shared virtual memory? I've never seen Apple push HSA; did they mention it at this year's WWDC, for example? From what I've seen, they're going all-in with Metal and are probably putting all the other APIs in maintenance mode.

It's not the first time Apple has completely changed discrete GPU vendors (there were zero Fermi-based products from Apple, for example). The pendulum swings back and forth every few years, probably to keep the other company from completely abandoning their platform. The bigger concern is really how pervasive the Intel GPUs are becoming, for example there isn't even an option to get a discrete GPU in the 21" 4K iMac.
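
For context on "graphics and compute API": here is a minimal sketch in Swift of what a Metal compute dispatch looks like, assuming a hypothetical kernel named add_arrays compiled into the app's default library. This is the kind of GPGPU work that OpenCL would otherwise cover.

Code:
import Metal

// Minimal sketch of a Metal compute dispatch.
// The kernel name "add_arrays" is a hypothetical example.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let function = library.makeFunction(name: "add_arrays"),
      let pipeline = try? device.makeComputePipelineState(function: function) else {
    fatalError("Metal setup failed")
}

let count = 1024
let length = count * MemoryLayout<Float>.stride
// Shared-storage buffers are visible to both the CPU and the GPU.
let inA = device.makeBuffer(length: length, options: .storageModeShared)!
let inB = device.makeBuffer(length: length, options: .storageModeShared)!
let out = device.makeBuffer(length: length, options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(inA, offset: 0, index: 0)
encoder.setBuffer(inB, offset: 0, index: 1)
encoder.setBuffer(out, offset: 0, index: 2)
// One thread per element: 32 threads per threadgroup, count/32 threadgroups.
encoder.dispatchThreadgroups(MTLSize(width: count / 32, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

Just a sketch under those assumptions, but it shows the overlap: the same compute work can be dispatched through Metal instead of OpenCL, which fits the view that Apple is going all-in on Metal.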
 
Don't worry, Asgorath, I've asked myself those questions as well.

Unfortunately I have no answer for them. But from an engineering point of view, pushing for the highest hardware capabilities and support rather than for software would explain the AMD cards and the HSA Foundation. The other part is Metal, which completely ruins that view. Maybe it is only there for "simplicity" purposes, to unify how apps are ported and run on the Mac platform? Maybe it will ultimately evolve into something bigger in the future; there are a few clues for that.

ARM and Imagination Technologies have lately been added to the HSA Foundation as contributors. We know that Metal started life as a mobile API. It may ultimately evolve into a bigger platform than it is right now, one that works on both mobile and desktop hardware.
 
Check out the specs on the upcoming Pascal line of GPUs.

http://wccftech.com/nvidia-pascal-gpu-17-billion-transistors-32-gb-hbm2-vram-arrives-in-2016/

Think we'll be lucky enough that Nvidia will still be adding GPUs to their web driver by then?
Clearly Apple has doubled down on AMD.

And with NVIDIA unlocking their web driver so it can be installed on official Apple products other than the Mac Pro, I think it's fair to say NVIDIA is doubling down on their web driver. I'd be pretty surprised if we don't see a GP100 driver for OS X basically as soon as the GPU gets released.
 
Wait till the GPU melts out of the back of one onto the desk. That'll get 'em jumping ship.

As AMD engages in more rebadging, consumers will notice. You can only rename a turd so many times.
 
Neither can you polish a turd
Or push a turd uphill
Need I go on :)
 
Please don't kill this thread with pointless bickering.... (And it takes two to bicker...)

Joke_over_your_head.jpg
 
Makes this LinkedIn job posting all the more mysterious:

https://www.linkedin.com/jobs2/view/61305436?trk=jserp_job_details_text

"join the NVIDIA Mac graphics driver team and help produce the next revolutionary Apple products"
"Working in partnership with Apple"

Did Apple sign a secret, timed deal with AMD, and are they planning to jump ship after the iMac 105˚C GPU fiasco?
One can only hope.

Nvidia is always prepping cards to compete with AMD for updates. That doesn't mean they always ship. I know MacVidCards has at least one of these cards. Apple basically does a bakeoff, and then decides which one to ship.

I think previously the card that lost the bakeoff would go on to be an aftermarket card (see: GTX 285, Radeon 3870, etc.)

It's possible AMD has an exclusivity deal, but I'd bet the bigger issue is that Nvidia's OpenCL performance has been horrible. They probably get trashed by AMD at the FCPX benchmarks.
 
Have you tried FCPX with the latest NVIDIA web drivers? I saw a pretty massive improvement with those. Yes, NVIDIA still struggles with apps like Luxmark, but that's because the ray tracing algorithm was tuned for the AMD architecture.

http://barefeats.com/gtx980ti.html

That shows a massive improvement in CL performance, at least in the OceanWave benchmark.
 