
Ever wonder why Apple suddenly went with ATI/AMD for almost every Mac's GPU?

Well, ladies and germs, boys and squirrels, the answer is simple. nVidia produces GPUs that either:

A. Burn up
B. Have high power consumption
C. Are barely within 1% of the competition
D. Are made from wood screws, with terrible 1.7% yields

However, let me allow nVidia, or rather one of their GPUs, to show you what I mean.

[Image: nVidia GPU burned/zapped]

Not even a moderate overclock. This is nVidia's flagship competing against AMD's HD 6990, and it couldn't even hold an overclock without blowing up.

Don't get me wrong, nVidia has made great GPUs, but as of late their offerings in the high-end sector are really bad [the GTX 460 and GTX 560 are exceptional products for their prices].
 
Apple switches on a regular basis. The performance of the high-end chips is irrelevant; none of them go in Macs anyway.
 
I see that others seem to be blaming the drivers (267.52 to be specific), but given the physical location of the spark (just beneath the PCIe power connectors), it appears to be the voltage regulator that went, not the GPU (that would have been bigger and produced a lot more smoke). That OC seemed rather mild IMO as well, and it should have been able to take it (an increase of 0.065V from stock).

This isn't meant as a defense of the 590 or nVidia (I'm not a big fan of their products these days), but this one appears to be a different issue, not a poorly designed or manufactured GPU. Granted, their GPUs are pigs power-wise, but whoever designed/manufactured the PCB should have built a better voltage regulator (one that can handle the load). Decreasing voltages and clocks (...) via drivers is, however, a cheaper fix than recalling the cards that have already been produced, even if they're still sitting on pallets at the manufacturing facility (i.e. reworking the PCB to replace the parts that are blowing with ones capable of withstanding the load).
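
To put a rough number on why even a "mild" bump matters: dynamic power scales roughly with f x V^2, so a small overvolt pushes noticeably more current through the same VRM phases. Here's a quick back-of-the-envelope sketch in C. The +0.065V figure is the one from the overclock above, but the stock voltage, clocks and board power are just assumed round numbers for illustration, not measured 590 values:

[CODE]
/* Back-of-the-envelope estimate of how a "mild" overvolt loads the VRM.
 * Dynamic power scales roughly with f * V^2. The +0.065 V is from the
 * overclock discussed above; stock voltage/clock/power are assumptions. */
#include <stdio.h>

int main(void)
{
    const double v_stock = 0.91;            /* assumed stock core voltage (V)  */
    const double v_oc    = v_stock + 0.065; /* the overvolt mentioned above    */
    const double f_stock = 607.0;           /* assumed stock core clock (MHz)  */
    const double f_oc    = 650.0;           /* assumed overclocked clock (MHz) */
    const double p_board = 365.0;           /* assumed stock board power (W)   */

    /* P_dyn ~ C * f * V^2, so the ratio of OC power to stock power is: */
    const double ratio = (f_oc / f_stock) * (v_oc / v_stock) * (v_oc / v_stock);

    printf("Estimated board power: %.0f W -> %.0f W (+%.0f%%)\n",
           p_board, p_board * ratio, (ratio - 1.0) * 100.0);
    /* More power at the same supply voltage means proportionally more
     * current through the same VRM phases - the part that apparently blew. */
    return 0;
}
[/CODE]

Even with round numbers that works out to something like 20% more power through the same regulator circuitry, which fits with the VRM, rather than the GPU die, being the weak link.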

After all, the information I've seen is that expected pricing is $600. Not a budget card by any means, so they should have done better.
 
I always wonder about agendas in threads such as this...corporate, political or other biases often seep into "facts".

I think Nvidia and ATI make great products for the PC. I'm hopeful that we'll see more of the high end cards for Macs with Lion. We'll see.

I wouldn't hesitate to buy from either vendor, but here is how I see it:
--Nvidia rocks with CUDA...and offers more for 3D professionals and video editors (at least for now, as FCP has been left dormant while Premiere leads. Could change any day.)
--Nvidia's drivers on Mac aren't so good (I blame Apple)
--Nvidia options are almost non-existent on the Mac right now (I blame Apple)
--Nvidia cards might run a little hotter and thus require an extra dB or two of fan noise.

I would love to see the world standardize on OpenCL over CUDA...but that's a ways off.
 
I always wonder about agendas in threads such as this...corporate, political or other biases often seep into "facts".
From my perspective, it's based mainly on technical issues, such as high power consumption, excessive heat/insufficient cooling, poor drivers, low yields (which drive up costs) and high failure rates (GeForce 8 & 9 series in particular). I also have issues with their business practices, since they still don't publish documentation, only provide closed drivers (binaries only), and to date have failed to come clean (full acknowledgement) about the failure rates & their cause in the GeForce 8 & 9 series (users as well as vendors have been burnt by this - pun intended :eek: :p).

So there's some validity IMO; it's not just an "I don't like them but can't give any solid evidence to support it" sort of thing.
 
Well, ATI/AMD cards have pretty much always performed better in OS X for the same approximate power level.

I think that's almost entirely due to drivers, but personally, I see no reason to bother with Nvidia if the situation is never likely to improve, and it really hasn't, yet.

And if drivers are being improved to the point where we'll have multiple choices that don't even necessitate flashing, well, then OS X users will be running out of reasons to use Nvidia cards unless they need a Quadro.
 
That happens if you use the old GeForce 267.52 drivers. GeForce 267.71 supposedly fixed this issue.

Currently AMD is ahead of NVidia, but as in the past, that can change very quickly.

After all, the information I've seen is that expected pricing is $600. Not a budget card by any means, so they should have done better.

Actually, it will be priced at $700 (to compete with the AMD 6990, which goes for ~$700), at least according to AnandTech.
 
nVidia, hmm, yes: cheap and readily available in huge numbers. Some work if you leave them stock or add a bigger heatsink or faster fan, as nVidia often seems a bit optimistic about the cooling capabilities of their chosen fans/heatsinks.
Anyway, I've been an ATI fan since the mid '80s and would never let an nVidia GPU even near any computer in my possession, no matter how fast it might be, and would only fit them on special demand.
 
Apple switches on a regular basis. The performance of the high-end chips is irrelevant; none of them go in Macs anyway.

Seeing as this thread is in the Mac Pro section, and Mac Pros can be shipped with the 5870, I'd say that qualifies as high end.

They switch according to how well the manufacturer makes their product. Currently AMD is leaps ahead of nVidia and its house-burning GPUs.
 
--Nvidia rocks with CUDA...and offers more for 3D professionals and video editors (at least for now, as FCP has been left dormant while Premiere leads. Could change any day.)

How so? CUDA is the exact same thing as OpenCL, except with vendor lock in.

Apple made the smart move here.
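
For anyone who hasn't written either: at the kernel level they really are almost the same thing. Below is a minimal OpenCL vector-add sketch in plain C against the OpenCL framework Apple shipped with Snow Leopard; error handling is stripped and the sizes/names are made up for illustration. The CUDA version of the same kernel differs in little more than spelling: the __kernel qualifier becomes __global__, the __global pointer qualifiers go away, and get_global_id(0) becomes blockIdx.x * blockDim.x + threadIdx.x.

[CODE]
/* Minimal OpenCL vector add - a sketch, not production code.
 * Build on OS X with something like:  cc vecadd.c -framework OpenCL -o vecadd */
#include <stdio.h>
#include <OpenCL/opencl.h>

/* The kernel. In CUDA this would be "__global__ void add(...)" with the
 * index computed from blockIdx/blockDim/threadIdx instead of get_global_id. */
static const char *src =
    "__kernel void add(__global const float *a,\n"
    "                  __global const float *b,\n"
    "                  __global float *c) {\n"
    "    int i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Grab the first GPU (Apple's implementation accepts a NULL platform). */
    cl_device_id dev;
    clGetDeviceIDs(NULL, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel source at runtime - no vendor toolchain required. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", NULL);

    /* Device buffers; COPY_HOST_PTR uploads a and b at creation time. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(a), a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(b), b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, NULL);

    clSetKernelArg(k, 0, sizeof(da), &da);
    clSetKernelArg(k, 1, sizeof(db), &db);
    clSetKernelArg(k, 2, sizeof(dc), &dc);

    /* Roughly the equivalent of a CUDA launch:  add<<<N/256, 256>>>(...)  */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);

    printf("c[42] = %f\n", c[42]);   /* expect 126.0 */
    return 0;
}
[/CODE]

The GPU-side code is essentially interchangeable; what you're really buying into with CUDA is the toolchain, the libraries and the lock-in.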
 
How so? CUDA is the exact same thing as OpenCL, except with vendor lock in.

Apple made the smart move here.

The only advantage is that CUDA is much more available right now and used more. However, OpenCL is getting there. Windows already leverages it, and OS X started with SL.
 
The only advantage is that CUDA is much more available right now and used more. However, OpenCL is getting there. Windows already leverages it, and OS X started with SL.

But CUDA is used more because NVidia is still pushing CUDA hard.

This is why CUDA bothers me. NVidia has a cross platform solution available to them, but they're still pushing the proprietary solution.
 
That happens if you use the old GeForce 267.52 drivers. GeForce 267.71 supposedly fixed this issue.
A driver released to the public that causes your hardware to blow up is inexcusable.

I'm guessing Apple switched for the same reason HP and everyone else did - they got burned bad by the 8 and 9 series mobile chip failures. Yes, it was nvidia's fault that the chips were bad, but the laptop vendors were the ones stuck with the enormous cost of replacing failed system boards. Plus their lost customers who got frustrated with one laptop failing after another. I know five people including myself who purchased HP laptops with the 8600m GT. All 5 of these failed in under a year. Only 3 were covered by HP's warranty. The other 2, despite having the 8600m GT chip, were declined by HP as "unaffected models". Unaffected my ass. I hope nvidia loses a lot of business.
 
But CUDA is used more because NVidia is still pushing CUDA hard.

This is why CUDA bothers me. NVidia has a cross platform solution available to them, but they're still pushing the proprietary solution.

But CUDA is NVidia-only, so it's not a surprise that they are pushing it. As long as some software uses CUDA instead of OpenCL, the people who use that software will prefer NVidia GPUs. They are doing the same thing with PhysX from what I have read: paying game developers to use PhysX.

When they can't fight against AMD in raw performance, they have to use grey methods to create at least some market for their products.
 
I think the reason is much closer to earth.

AMD probably outbid nVIDIA on unit price, hence winning this round.
 
[Image: fermititle.jpg]
 
But CUDA is NVidia-only, so it's not a surprise that they are pushing it. As long as some software uses CUDA instead of OpenCL, the people who use that software will prefer NVidia GPUs. They are doing the same thing with PhysX from what I have read: paying game developers to use PhysX.

When they can't fight against AMD in raw performance, they have to use grey methods to create at least some market for their products.

Exactly. True, a customer with an nVidia GPU gets better performance, but at what price? Progress on GPU performance is hindered.

Ever see the "Meant to be played" slogan pop up in games? That's nVidia paying developers to optimize games for nVidia GPUs. Of course it's going to look as if nVidia has the upper hand. In reality they don't, and benchmarking tools show this.

Also, comedic relief:

[Image: Fermididegypt.jpg]
 
This is why CUDA bothers me. NVidia has a cross platform solution available to them, but they're still pushing the proprietary solution.

From a business standpoint, NVIDIA's reasoning for this is pretty sound. NVIDIA can market CUDA and make it exclusive to their cards. They ink business partnerships with companies like Adobe (i.e. Premiere Pro) to lock their customers into buying high-end NVIDIA cards. It isn't about CUDA being better than OpenCL or vice-versa. It's about artificially creating a competitive edge.

NVIDIA did a similar thing with PhysX, in a way. They gobbled the tech up from Ageia (PhysX started as a standalone, add-in card that ran alongside a machine's primary graphics card) and made it exclusive to their GPUs. NVIDIA even went as far as intentionally crippling software (CPU hosted) PhysX in order to promote the sale of a PhysX enabled NVIDIA card.

This all seems pretty deceptive, but in the end, it's business.
 
Nvidia Geforce 7800 GT

Hello,

In addition to my 6-core Mac Pro I acquired a near-mint Power Mac G5 Quad with the better LCS in it. My Quad came with the Nvidia GeForce 7800 GT. So far I notice this is a very LOUD video card.

Can someone confirm that this video card is a lousy one because of the noise it produces? I figured I'd ask this question here, as many 1,1 Mac Pro owners used to have the 7300 GT in their Mac Pros.

Is there a firmware update to silence this card? The fan is huge on this thing.

Thanks,

Josh
 
I'm using the CUDA GTX 285 because I use Adobe Premiere CS5/After Effects. If and when Adobe decides to allow AMD cards to have the same GPU acceleration I'd be more than happy to switch back.

Pretty sure Apple just doesn't want to support NVIDIA's proprietary CUDA tech (plus it makes the current FCP seem slower than dirt). Besides, only Apple wants to have proprietary tech on their machines. ;)
 
It's not a case of allowing GPU acceleration, it's a case of explicitly supporting OpenCL...

I would have hoped by now that OpenCL would have seen more traction, but CUDA did get to market first, as it were. Regrettably.

But these things take time.
 
Hello,

In addition to my 6-core Mac Pro I acquired a near-mint Power Mac G5 Quad with the better LCS in it. My Quad came with the Nvidia GeForce 7800 GT. So far I notice this is a very LOUD video card.

Can someone confirm that this video card is a lousy one because of the noise it produces? I figured I'd ask this question here, as many 1,1 Mac Pro owners used to have the 7300 GT in their Mac Pros.

Is there a firmware update to silence this card? The fan is huge on this thing.

Thanks,

Josh

Another failure round of nVidia.
 
CUDA will only give them traction for so long. Heck, I'm a CUDA developer and even I gave up on NVidia after the 10.6.3 driver fiasco. I used to have a dual NVidia GPU system; now one of those GPUs has been replaced with a 5870.

I'd also take NVidia more seriously if they supported Tesla on the Mac.
 