Just wondering if anyone has set up dual GTX 970s in their cMP yet? Interested to know your results. I currently have a GTX 770 and its average frame rate seems not far behind the GTX 970 in Heaven, so I'm guessing the dual setup is the way to go? (FCPX and 3D modelling)


I tested a GTX 970 using only PCIe Booster A (Y splitter to 6-pin & 8-pin). The card worked exactly as it did when drawing power from both PCIe Boosters A & B. Two of these GTX 970s will not require an external power supply.
 
I think we have all of the major kinks worked through. The new driver with HDMI support may require some fine tuning but we have:

1. Boot screens on 3,1/4,1/5,1
2. PCIE 2.0 on OS X and Windows, all models
3. Solid, stable performance using latest drivers.

Will hopefully start taking pre-orders late December, shipping first week of January. (assuming we don't find any more bugs)

It was a lot of work, but I had some help. The Mac world would be a much duller place without Netkas; where my smoke-and-mirrors skills run out of steam, he steps up.

Ladies and Gentlemen, I present the GTX980 Mac Edition.

Who was that masked man who has the habit of coming to the rescue when the times look darkest? Thank you, "Lone Ranger" */.

Will it do the same on MacPro 1,1 and 2,1 models?


*/ http://en.wikipedia.org/wiki/The_Lone_Ranger_(2013_film) .
 
I've only been aware of your services for a relatively short while but already I have seen some great products (one of which I have in my cMP In the form of BT 4.0 and wifi adapter)... Top job MVC #
 
Merry Christmas, indeed! Good work MVC & Netkas!

Happy Holidays to everyone.

Yes, this is a wonderful thing. Watching the screen flash grey after hitting the power button has been a thrill of mine since the G4 days. When it's the result of creating something that didn't exist before, it really makes me happy.

We can't forget that these EFI boot screens would be rather pointless if Nvidia weren't continuing to support us with drivers. Apple would just as soon have Nvidia stop writing them and let the cMP wilt on the vine. Buying Nvidia cards and using them on a Mac will keep these drivers coming.
 
Good that MVC has added PCIe 2.0. But anyone worrying about that issue should know that the PCIe version doesn't matter much for video, and surprisingly even the lane count doesn't matter as much as people think. Games are far, far away from saturating the bandwidth we have.

Here are benchmarks of the GTX 980 running the most intense games on every possible combination of PCIe version and lane count. The difference from the slowest configuration to the fastest is tiny in most cases, probably only interesting to online multiplayer maniacs who want every frame possible.

http://www.techpowerup.com/mobile/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/1.html
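For a rough sense of why that is, here's a small sketch comparing theoretical one-way PCIe bandwidth across generations and lane counts. The per-lane figures are the commonly cited theoretical rates (after encoding overhead), not measurements from the linked review:

```python
# Rough theoretical PCIe bandwidth per direction, to show why even
# an older/narrower link is rarely the bottleneck for games.
# Per-lane usable throughput in GB/s (after 8b/10b or 128b/130b encoding).
PER_LANE_GBPS = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985}

def bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

for gen in ("1.1", "2.0"):
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: {bandwidth(gen, lanes):.1f} GB/s")
```

Even the slowest configuration in the linked review (PCIe 1.1 x4, 1 GB/s) still delivers playable frame rates, which is why the spread is so small.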
 
Thanks for link, interesting read!
 
It also means you can install graphics cards in the x16 slot 1 and x4 slot 4 and connect them with an extended SLI cable with little speed penalty. Then use the x16 slot 2 for future NVMe SSDs that are almost twice as fast as today's, and the x4 slot 3 for USB 3.1.

That's all 40 lanes used up in the most optimal configuration a 5,1 can do.

Edit. Would be even better if there was a dual GPU in slot 1 (external power), NVME RAID in slot 2+3, USB 3.1 in slot 4.

A nMP would not be able to touch that spec for a while unless it changed its form factor and added a bigger PSU.
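A quick sketch of the lane math behind that layout. The slot widths are my assumption from the 4,1/5,1 spec (slots 1 and 2 at x16, slots 3 and 4 at x4); the card names are just illustrative:

```python
# Sketch: check that a proposed card layout uses the cMP 5,1's full lane budget.
# Slot widths assumed from the 4,1/5,1 spec sheet.
SLOT_LANES = {1: 16, 2: 16, 3: 4, 4: 4}

layout = {
    1: "GPU (x16)",
    2: "NVMe SSD carrier (x16)",
    3: "USB 3.1 card (x4)",
    4: "Second GPU (runs at x4, bridged via extended SLI cable)",
}

total = sum(SLOT_LANES[slot] for slot in layout)
for slot, card in layout.items():
    print(f"Slot {slot} (x{SLOT_LANES[slot]}): {card}")
print(f"Lanes used: {total} of {sum(SLOT_LANES.values())}")
```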
 
It's a really nice price/performance ratio right now but in 2 years it will start to show its age, PCIe slots or not. At some point the cMP will be held back by the processor.

Depending on what you do of course, I'm still running a 3,1 Mac Pro with no issues.
 
Five years at least. Consider that when Apple introduced the nMP, it said the new machine is good for 10 years. Yet the 5,1 specced to the max has equal CPU performance and wins on every other point except Thunderbolt I/O.
 
I thought the nMP RAM was faster than the 5,1's max of 1333 MHz; or is it hardly noticeable, or can it be overcome? Also, the next nMP processors support DDR4 RAM; would that be a significant and noticeable difference in speed?

Working on my maxed 5,1 3.46 GHz 12-core is blindingly fast for most things. I had it down to last me another 3 solid years, so it's good to hear someone give it 5.

Cheers.
 
The maxed-out 12-core 5,1 and 6,1 run neck and neck in Geekbench (ignoring efficiency), but that only measures CPU and memory. Graphics and storage can be upgraded to much faster speeds than a 6,1 allows.

If you're using your machine for media and print production, it's good for 10 more years if it doesn't fail. The lack of Thunderbolt is not a big issue: you can use a Mac mini to capture Thunderbolt data from something like a 4K video camera, save it to a RAID, and access those files over Ethernet without needing to copy them to the 5,1.

Games will be good for 5-6 years; by then a new architecture will likely have replaced PCIe completely, even the lowest game settings will be too demanding for these cards, and there will probably be no more PCIe video cards.
 
Higher RAM speeds are nothing but smoke and mirrors, just a marketing ploy. Even 1066 MHz DDR3 is not bad compared to current RAM:

http://www.tomshardware.co.uk/forum/314892-30-cl11-what-difference

http://www.computerbase.de/2012-05/test-welchen-ram-fuer-intel-ivy-bridge/3/

There is less than a 2% difference from DDR3-2133 to DDR3-1333, and a 4.5% difference from DDR3-2133 to DDR3-1066.
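As a back-of-envelope check, theoretical DDR3 peak bandwidth scales linearly with transfer rate, even though the linked tests show real applications barely notice. A minimal sketch (theoretical peaks per channel, 64-bit bus; not measured figures):

```python
# Theoretical DDR3 peak bandwidth: transfer rate (MT/s) x 8 bytes per channel.
# Real-world application gains are far smaller, per the linked reviews.
def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 1) -> float:
    """Theoretical peak bandwidth in GB/s for DDR3 at a given transfer rate."""
    return mt_per_s * 8 * channels / 1000

for speed in (1066, 1333, 2133):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s per channel")
```

On paper DDR3-2133 has 60% more bandwidth than DDR3-1333, yet the application-level difference cited above is under 2%, because most workloads are not bandwidth-bound.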
 
Great insight from you both. The satisfaction and sense of achievement I've gained from my 4,1 upgrade project keeps growing. So pleased I didn't shell out £4,500 for that 8-core 6,1 nMP.

Thanks.
 
To those newcomers and those upgrading from very old cards: I want to add that you can power up to a 333 W graphics card using internal power alone, but I am trying to confirm this with a survey.

The various configurations are below:

A. 1 x mini PCIe to 1 x 6-pin connector (75 W) + PCIe slot (75 W) = 150 W
B. 2 x mini PCIe to 1 x 8-pin connector (150 W) using adaptor + PCIe slot (75 W) = 225 W
C. 2 x mini PCIe to 2 x 6-pin connectors (150 W) + PCIe slot (75 W) = 225 W
D. 2 x mini PCIe to 1 x 8-pin connector (150 W) using adaptor + dual SATA to 6-pin adaptor (108 W max) + PCIe slot (75 W) = 300-333 W

You obviously have to sacrifice drive bays for the last one. The main issue is confirming whether the SATA rail is safe for this. Total system power consumption is also something to watch out for.

Most here advise against trying to draw 150 W from a single 75 W mini PCIe motherboard connector by using a Y splitter, even if other people have done it. We don't have long-term safety reports.
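As a sanity check on those figures, the options tally like this. The wattage ratings are the ones quoted in this post (75 W per mini PCIe booster, 75 W from the slot, ~108 W from dual SATA), not official Apple specs:

```python
# Tally the internal power-budget options listed above for a cMP.
# Ratings are the poster's figures, not official specifications.
SLOT_W = 75        # PCIe slot itself
MINI_PCIE_W = 75   # each mini PCIe (booster) connector
DUAL_SATA_W = 108  # dual SATA to 6-pin adaptor, quoted maximum

configs = {
    "A: 1x 6-pin": 1 * MINI_PCIE_W + SLOT_W,
    "B: 8-pin via adaptor": 2 * MINI_PCIE_W + SLOT_W,
    "C: 2x 6-pin": 2 * MINI_PCIE_W + SLOT_W,
    "D: 8-pin + dual SATA 6-pin": 2 * MINI_PCIE_W + DUAL_SATA_W + SLOT_W,
}

for name, watts in configs.items():
    print(f"{name}: {watts} W")
```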

If the dual-GPU GeForce GTX 990 uses 6-pin and 8-pin connectors and consumes 300 W or less, you should be able to power it using method D ;)

That's 4K gaming at over 60 FPS with max settings on a Mac Pro that came out in 2009 :)
 
7 FPS is a big difference when your FPS is under 60 to begin with.
Also, run the benchmark at Ultra/Extreme to get a better idea of the true performance. Setting the benchmark to 1440p instead of 1080p will further stress the cards and lower the score, too.
 