
Gios93, macrumors newbie (original poster), Aug 21, 2017, Chicago
2009 5.1
3.46GHz X5690 CPU
OWC 128GB SSD
1TB HDD
ATI Radeon 5770
macOS Sierra

Hi,

I am new to MacRumors. I recently purchased my first Mac Pro, a 2009 Mac Pro 4,1 with dual 2.26GHz processors, and then flashed it to 5,1. I installed two Intel X5690 Westmere processors after delidding them. My current Geekbench scores are 24,800 multi-core and 3,000 single-core. I would like to install an NVIDIA GPU and I'm curious what recommendations you could offer at a reasonable price point with good driver availability, plus any specs others can share from their installations for comparison. If you have any Geekbench scores comparing before and after the GPU install, that would be wonderful. Thanks
 
1) All you need to know is in the sticky thread.

https://forums.macrumors.com/thread...out-nvidia-pc-non-efi-graphics-cards.1440150/

2) GPU won't affect CPU GB score

3) If driver availability is your concern (e.g. you prefer native driver support), then your options narrow down to pretty much just the GTX 680.

If you mean "can have proper driver support", then almost all Nvidia cards are properly supported under the current web driver.
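For anyone unsure what their cMP is actually running, Terminal can show the installed GPU and whether NVIDIA kexts are loaded. This is only a sketch for macOS: the idea that web-driver kexts carry a "Web" suffix (e.g. NVDAStartupWeb) reflects how the web driver package has typically named them, but exact kext names can vary by driver version.

```shell
# Show the installed GPU model(s)
system_profiler SPDisplaysDataType | grep "Chipset Model"

# List loaded NVIDIA kexts; web-driver kexts typically contain "Web"
# (e.g. NVDAStartupWeb), while Apple's native NVDA kexts do not.
kextstat | grep -i nvda
```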
 

I am very happy with an unflashed EVGA GTX 750 Ti in mine.
 
If you must go NVIDIA, to me the only reliable option is the GTX 680. I returned my GTX 1070 FE due to the buggy NVIDIA driver for Pascal cards in macOS. Sure, people say it works fine in general use but I, and others, have found it to not be reliable enough, especially if you're doing serious work with your cMP. Users of the 1060 have reported similar issues and I'd bet a 1080/Ti is no different.

If you're happy to go AMD, then it's a no-brainer: wait for availability of the Sapphire RX 580 Pulse 8GB card.
 
Have to agree with Squuiid, the GTX 680 is still the absolute king for general Mac usage and Apple's pro apps.
Although I don't know if the GTX 780 would be better, since it also uses native Apple Kepler drivers (A1, the first revision of the card)?
Maxwell and Pascal are good if you're buying them for specific apps or gaming, or if your workflow requires CUDA compute.
Buying them for anything else is just a waste of the GPU. I sold my GTX 980 because of GUI slowdowns and glitches, and because most of the FCPX rendering was being done by the CPU.
The 680 is giving me a much better overall experience in macOS, although I have an R9 280X too and I swap them depending on my current mood.
 
The GTX 980 Ti kicks ass in my system. I've not had any problems with it. My only complaint is that I can't get same-day OS updates, but NVIDIA are very good about speedy web and CUDA driver releases, usually within a day or two.
 
I sold my GTX 980 because of GUI slowdowns and glitches
Yep, same issues with a 1070. NVIDIA web driver related, without a doubt. I removed the 1070, uninstalled the web driver and replaced it with my old GTX 680 using the macOS native driver, and it was like using a new machine. All the glitches, UI slowdowns and black screens when resuming disappeared instantly.
I find the cMP to be incredibly stable, reliable and glitch-free when using the 680. It really is impressive.

Plus, there's the added bonus of now being able to install OS updates, including betas, as and when I please. This frustrated me about using the NVIDIA web driver more than I thought it would.

The only reason for me to change the GTX 680 out for an RX 580 is to get DisplayPort 1.4 and HDMI 2.0, thus allowing 5K in future. The performance difference between the two is apparently not that great, and very dependent on which applications you use.
Not sure if an RX 580 has better Metal support than the 680 however.
 
At least with Maxwell GPUs I have no problems (GTX 960, GTX 980 Ti and GTX Titan X), running with the web driver.

The disadvantage of Kepler GPUs like the GTX 680/770 is no boot screen at 4K resolution, even with flashed cards.
 
The disadvantage of Kepler GPUs like the GTX 680/770 is no boot screen at 4K resolution, even with flashed cards.
Good point!

I think if you want 4K/5K then you're best off with the Sapphire RX 580 Pulse 8GB anyway.
Native macOS drivers and semi-supported by Apple, albeit no boot screen. You do get recovery-mode screens however, as the driver is built in. Not sure if you'd have that with Kepler at 4K. I could be wrong though.
 
I know, I ordered this RX 580 two months ago, and I'm still waiting for the card. o_O Very curious to gain my own experience. Initially the RX 580 is planned as an eGPU for my Mac Mini and MacBook Pro.

Let's summarize:
Kepler:
+ native OS driver support, with no issues
- no boot screen in 4K resolution and higher with flashed cards
- mediocre performance

Maxwell:
+ boot screen in 4K and higher resolutions with flashed cards
+- average to better performance, depending on card
+- Nvidia web drivers needed, minor issues

Pascal:
+ boot screen in 4K and higher resolutions with flashed cards
+ best performance
- Nvidia web drivers needed, larger issues especially with GTX 1060 cards

The Sapphire RX 580 Pulse 8GB (not the 4GB version!):
+ native OS driver support, with no issues
- no boot screen
+- average to better performance
- no availability (at least at the moment)
 
I recently upgraded from dual (flashed) GTX 680 4GB to dual (un-flashed) GTX 1080 TI's. I do mostly CAD design and rendering and needed the most CUDA cores I could get. I was always happy with the 680's and never had a problem with them but the 1080's are smoking fast for rendering. Basically cut my render times to about 1/4th the time of the 680's. A one hour render now takes 15 minutes. I have a GT-120 installed for boot screens if needed.

Dual EVGA GTX 1080 TI FTW3's

[Attachments: EVGA GTX 1080 Ti FTW3 screenshot; OctaneBench results, x16 slots x2]
 
Dr. Stealth, any issues with heat during rendering? I always spec blower-style coolers for multiple-card setups.

No heat issues here. I typically stack my renders in a queue during the day, then start the queue when I leave for the night. GPU temps when humming at full load hit 55°C to 60°C. I know lots of people bash the cheese-grater look of the cMP, but it really is a brilliant design, allowing massive airflow in the front and out the back. (I'm a retired aerospace engineer.) Most people who complained wanted Apple to change the design purely for aesthetics, function be damned. Well, they got what they asked for. =)

I can say this though: when I walk into my office in the morning after a full night of renders, my room is very toasty.

Also, I run one card in slot 1 and one in slot 4. I wouldn't want to stack them next to each other, as the cards take every bit of the two-slot space. The backplate of one card would actually seal off the fans of the lower card and may even ride against them.
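The "toasty room" observation checks out with simple arithmetic. A rough back-of-the-envelope sketch, assuming two GPUs at around 280 W each (a figure mentioned later in this thread) and an 8-hour overnight queue, both of which are assumptions rather than measured numbers:

```python
# Back-of-the-envelope: heat dumped into the room by an overnight render.
# Assumes two GPUs drawing ~280 W each and an 8-hour queue; essentially
# all electrical power a computer draws ends up as heat in the room.
gpu_watts = 2 * 280            # two open-air cards under full load
hours = 8                      # overnight render queue
kwh = gpu_watts * hours / 1000
btu = kwh * 3412               # 1 kWh is about 3412 BTU
print(f"{kwh:.2f} kWh, roughly {btu:.0f} BTU of heat")
```

That is in the same ballpark as running a small space heater all night, which matches the morning-office experience described above.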
 
Slot 4? Isn't that PCIe 2.0 x4 only?
Does that impact performance much on that second 1080 Ti?
 

I'm no expert on the cMP, but I thought the same thing. However, I ran many real-world (my-world) benchmarks with my rendering program before deciding on slot placement. The biggest difference I saw rendering the exact same scene with a single card in slot 1 versus slot 4 was about 3%. Not enough to worry about. Plus, as I mentioned, if the cards were stacked in slots 1 and 2 the lower card would be severely starved for air. Blower-style GPUs could probably be stacked, but I really wanted the extra bang and speed of the open-air style GPUs.
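The ~3% figure squares with how GPU renderers use the bus: the scene is uploaded to VRAM once and the heavy lifting happens on-card, so link width mostly affects load times, not render times. For reference, a sketch of the theoretical PCIe 2.0 numbers (5 GT/s per lane with 8b/10b encoding leaves roughly 500 MB/s of usable bandwidth per lane, per direction):

```python
# Theoretical usable PCIe 2.0 bandwidth per direction.
# 5 GT/s per lane with 8b/10b encoding -> ~500 MB/s usable per lane.
PER_LANE_MB_S = 500

def pcie2_bandwidth_gb_s(lanes: int) -> float:
    """Rough usable bandwidth in GB/s for a PCIe 2.0 link."""
    return lanes * PER_LANE_MB_S / 1000

# The 2009 Mac Pro's slots 1-2 are x16; slots 3-4 are x4.
for slot, lanes in [("slot 1 (x16)", 16), ("slot 4 (x4)", 4)]:
    print(f"{slot}: ~{pcie2_bandwidth_gb_s(lanes):.0f} GB/s per direction")
```

So slot 4 has a quarter of the link bandwidth of slot 1, yet the render benchmark above barely notices, because the GPU rarely saturates either link once the scene is resident in VRAM.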

If interested you can see some of my work here... http://dg-digital.com/
 
I'm upgrading my 680 to a 1080 Founders Edition chipped for the Mac. The 680 is a PC stock version if you're interested: no boot screen, but it works OK. I might even throw in the two power cables.
 
Dr. Stealth, thanks for the info. I checked out your website - pretty impressive images! If the modular Mac Pro supports 4 full sized GPUs in the chassis, would you populate all 4 slots, or are you at the point of diminishing returns for your workflow (overnight rendering)?

Cheers
 
I'm upgrading my 680 to a 1080 Founders Edition chipped for the Mac. The 680 is a PC stock version if you're interested: no boot screen, but it works OK. I might even throw in the two power cables.
My two cents...I would think twice about that.
As you can see from my signature, I too have that card and I can tell you that it is garbage in macOS. NVIDIA's Mac driver for Pascal cards is a buggy, glitchy mess.
I ended up dedicating it to Windows only and use it for gaming now.
I returned a 1070 which I was using in my other Mac, for the same reason.

I think you'll be much happier with a Sapphire RX 580 Pulse 8GB. Even without boot screen.

I reverted back to my 680 while I await delivery of the new card, which is on pre-order.
 
I know lots of people bash the cheese-grater look of the cMP, but it really is a brilliant design, allowing massive airflow in the front and out the back.

Interesting, may I ask how the hot air in the PCIe compartment gets out smoothly? It's the biggest mystery of the cMP to me.

In the CPU compartment it's very clear: the fans direct the airflow to cool the CPU(s), and it's then pushed out the back.

For the PSU it's also clear that the fan keeps pushing air through the PSU, and there are enough holes at the back for the air to escape. However, it's not clear where that air comes from; there are some holes in the front, but they're mostly blocked by the optical bay door, and the hot air inside the PCIe compartment cannot easily rise up into the PSU compartment.

For the PCIe compartment, it's clear that the PCIe fan keeps sucking in cool air. However, with an open-fan type GPU, the hot air should just circulate inside the cMP. If you have a second card installed in slot 4, then yes, it will push hot air out the back, but unless it's a blower type, I suspect most of the hot air still stays inside the cMP. Also, with only one open-fan GPU in slot 1, the hot air above slot 2 should be pretty hard to get out (if all the PCIe slot back plates are in place).

Or does the cMP actually use the CPU exhaust fan to push the PCIe compartment's hot air out? The PCIe compartment is pretty much isolated from the PSU compartment; the side panel clearly leaves no gap there. But there is a large gap between the CPU compartment and the PCIe compartment, so air can be exchanged there. Still, is it really effective when the exhaust sits at the bottom and has to suck the hot air out?
 
Dr. Stealth, thanks for the info. I checked out your website - pretty impressive images! If the modular Mac Pro supports 4 full sized GPUs in the chassis, would you populate all 4 slots, or are you at the point of diminishing returns for your workflow (overnight rendering)?

Cheers


If I were still actively doing CAD and rendering to make a living, I'd certainly think about filling the box with as many GPUs as it could hold. For my rendering program there's a nearly linear increase in performance as you increase the number of CUDA cores. So yes, the old time=money thing would drive me. But I'm pretty much retired, only doing a little freelance work here and there. The work I did during my career has turned into my retirement hobby.
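That near-linear CUDA scaling can be sketched as simple arithmetic. The helper below is hypothetical (not from any renderer's API); the 3584-core count is the GTX 1080 Ti's published spec, and perfect linear scaling is an idealization that real renderers only approach:

```python
# Idealized render-time estimate assuming near-linear scaling with the
# total CUDA core count (real-world scaling is slightly below linear).
def estimated_render_minutes(base_minutes: float,
                             base_cores: int,
                             total_cores: int) -> float:
    return base_minutes * base_cores / total_cores

CORES_1080TI = 3584  # CUDA cores per GTX 1080 Ti

# A 60-minute render on one card, estimated on two:
print(estimated_render_minutes(60, CORES_1080TI, 2 * CORES_1080TI))  # 30.0
```

This is also consistent with the earlier report in the thread: going from dual GTX 680s to dual 1080 Tis roughly quadrupled the core budget and cut a one-hour render to about fifteen minutes.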
 
Interesting, may I ask how the hot air in the PCIe compartment gets out smoothly? It's the biggest mystery of the cMP to me.


Okay, nice to hear from you h9826790... You brought up a problem that I never really thought about; the issue of my PCIe bay overheating never occurred to me. Probably because I believe the engineers at Apple did a very thorough job of ensuring it would never be a problem. I don't think it is a real "mystery" how the hot air gets out. Air is actually much more "fluid" than water.

The cMP case has 5 variable-speed fans controlled by either software or firmware. My GPUs each have 9 thermal sensors that independently control the variable speed of the 3 onboard fans. Plus there are many software programs that can monitor and override the fan speeds, but I've never had to use them.

If you do a search on MacRumors for "My cMP is overheating" you will not find one post. Sometimes I think these pseudo-problems are just an attempt to make conversation, as the real issues with the cMP are so limited.

I would ask you this: if my PCIe bay is overheating, what would be the indicator of the problem? High GPU temps? Well, I have had nothing of the sort. Please enlighten me on the possible issues.

Thanks.

Thanks for your reply. But it seems I confused you a bit, sorry about that.

I have no intention of saying the cMP will overheat. I am quite happy with its cooling ability. However, in my own test with an open-fan type card, opening the side panel actually cools the card by 3-5 degrees, which makes me believe the PCIe compartment's airflow is not that good, especially compared to a modern PC case. (With a good design, opening the side panel should decrease the cooling ability by disturbing the airflow.)

I know there are 5 fans, but 3 of them are almost dedicated to the CPU only, and one is clearly for the optical bay and PSU. Only one sucks cool air into the PCIe compartment, and there is no "smooth" way out.

I am not saying my cMP is going to overheat. I just wonder how you determine there is massive airflow through the PCIe compartment; I'm especially interested in where the hot air goes and how it can be exhausted effectively.
 

Yes, sorry, I guess I misunderstood you. I have no heat issues while running two open-air GPUs that can draw up to 280W each. Also, in slot 2 I have an Amfeltec PCIe board with dual SM951 512GB SSDs, which also produce quite a bit of heat. I am running an Apple GT-120 blower-style card in slot 3, so I imagine that helps to some degree to exhaust the PCIe chamber.

So, we know the hot air is getting out; you're just curious as to how. It would be interesting to see a smoke test done to follow the airflow.

Cheers!
 