
JohnBu

macrumors newbie
Original poster
May 4, 2018
I'm just starting to look at upgrading my faithful 2009 Mac Pro (2.93 Quad, 16GB RAM, GTX680, Mac OS 10.11).

Thus far I've successfully reflashed the firmware to 5,1 and ordered a W3690 processor. I understand that once the processor arrives and I get it installed I will be able to use faster RAM (1333). I'm just a bit worried about buying the right memory. I've found the following ( https://www.megamac.com/collections...3mhz-pc3-10600-ddr3-ecc-sdram-owc1333d3w8m32k ) but can anyone confirm whether this should work fine in my reflashed 4,1/5,1? If so, my understanding is the memory would run at 1066 with the old processor and then at 1333 once the new processor is installed?

Also, my graphics card is currently a PC NVIDIA GTX 680 (this was just plug and play really, with the NVIDIA web drivers). I've read through a lot of threads here about graphics card compatibility, but what I'm missing is an understanding of what is good value for money. I'd be happy to pay approx. £400 for something that provided a substantial upgrade over the 680. The RX 580 seems to get a lot of good press, but it originally retailed for £200-odd and now you can't find it for less than £310, so it feels like there could be a value-for-money issue there. At the risk of getting lynched for asking: what do people think would be the best-value fit for my upgraded system and budget?

Thanks in advance for any help and guidance :)


John
 
May sound like a weird question, but why do you need a new GPU?
What are you using this machine for?
Can any of your applications utilize GPU processing (CUDA, OpenCL, Metal)?
What monitor(s) are you using and what resolution(s) are you trying to run?

Ultimately, your money might be better spent on additional RAM rather than a GPU if you're running only 16GB.

I'm running a GTX 1080 FE in my 5,1 but I use applications that utilize CUDA and OpenCL processing. Not everyone is in that situation.
 
The RAM will work fine with both processors, your current one and the new one after you install it, so you have the right understanding: the current CPU limits it to 1066MHz and the new one will allow 1333MHz. HOWEVER, I would recommend using a kit of 3 DIMMs, as triple-channel memory gives you the most optimal speed. So either 3 x 8GB = 24GB total, or 3 x 16GB = 48GB total. Only use that 4th slot if you REALLY need the extra 8GB or 16GB of RAM. So if you are not maxing out your software with the current RAM, 3 DIMMs is the better setup IMO.

I will let others chime in on the GPU upgrade because I am shopping around for the same type of upgrade myself. But yes, it does seem the RX 580 is a compatible and decent mid-range upgrade. The Sapphire Pulse version in particular seems to work well with the cMP.
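(For the curious, the "3 DIMMs" advice comes from these CPUs having a three-channel memory controller. A back-of-the-envelope sketch of theoretical peak bandwidth, using textbook numbers rather than anything measured:)

# Rough theoretical peak bandwidth: DDR3 moves 8 bytes per transfer per channel.
def peak_bandwidth_gb_s(mt_per_s, channels, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

print(peak_bandwidth_gb_s(1333, 3))  # ~32.0 GB/s: three DIMMs at DDR3-1333
print(peak_bandwidth_gb_s(1066, 3))  # ~25.6 GB/s: same DIMMs held to 1066 by the old CPU
print(peak_bandwidth_gb_s(1333, 1))  # ~10.7 GB/s: a single channel on its own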
 

While the 680 has been reliable and has given decent performance with anything I've thrown at it, I also like to game on my Mac Pro (my guilty pleasure :) ). I use RISC OS for a hobby/fun and I have my Mac Pro for everything else. I've not liked the direction Apple are taking the desktop OS and the Pro desktop range, so I've decided to push the envelope on my current Mac Pro for the time being and get a few more years of usage out of it before I need to think about getting a new system.

I'm using a 24" Samsung at the moment and run at 1920 x 1080. But again I might well upgrade the monitor in the near future as I'm on a bit of a roll.

That's great news about the RAM - thanks for your help. Although now I have to decide how far to push the envelope - £129 vs £279 :) Edit: Ordered the 48GB (3 x 16GB) in the end :)
 
Not a gamer, so I cannot comment on whether AMD is better than NVIDIA for your needs. But if you just want one GPU with a simple power connection/setup, the GTX 1080 FE is very straightforward with a dual mini 6-pin to standard 8-pin cable, and it should work out of the box since you're (probably) already running the NVIDIA web drivers with your GTX 680. NVIDIA sells these directly without the markup, when they have stock available.

There seem to be several people using AMD cards, but since they do not work with CUDA I've never seriously considered them for my needs. There was a thread about a POSSIBLE workaround to enable hardware acceleration for video encode/decode with an AMD GPU, but that never really went anywhere IIRC.

I'd suggest getting some more RAM and then evaluating your system before dropping a lot of money into it. The good thing about stock GPUs is they can be moved to another machine later on.
 
You haven't mentioned storage. If you don't already have an SSD installed, I heartily recommend it. Even at SATA II there's a very nice performance boost, and that's even from the "value" less expensive SSD brands. (A situation where value actually means it!) If you decide to chase I/O performance, you can add a SATA III controller, for example the Apricorn units, or go to a PCIe SSD directly. For that last option you'll either need High Sierra or an AHCI unit, though; pre-High Sierra OS X doesn't talk NVMe, which is by far the more common PCIe SSD protocol.
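(If you want to see the difference for yourself, here's a rough sequential-write test sketched in Python. The file path is an assumption - point it at the drive you care about - and the result is only a ballpark figure, not a proper benchmark. For reference, SATA II tops out around 300 MB/s and SATA III around 550-600 MB/s in practice:)

import os, time

TEST_FILE = "/tmp/write_test.bin"   # assumption: /tmp lives on the disk under test
CHUNK = b"\0" * (8 * 1024 * 1024)   # write in 8 MiB chunks
TOTAL = 1024 * 1024 * 1024          # 1 GiB in total

start = time.time()
with open(TEST_FILE, "wb") as f:
    written = 0
    while written < TOTAL:
        f.write(CHUNK)
        written += len(CHUNK)
    f.flush()
    os.fsync(f.fileno())            # make sure the data actually hits the disk
elapsed = time.time() - start
print(f"{TOTAL / elapsed / 1e6:.0f} MB/s sequential write")
os.remove(TEST_FILE)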
 

Thanks for the tips! Bearing in mind what you've said about power considerations (a great point - thanks!), I've done a little research and found that the Gigabyte GTX 1080 Gaming 8G also has a single 8-pin power connection and runs a little cooler than the Founders Edition. Amazon had it for just a few pounds more than the FE from NVIDIA, so I've pulled the trigger. Just need to find a suitable cable.

I appreciate your point about waiting and seeing, but if I did change systems in the very near future it would be to a Linux build, as I don't feel Apple have a new equivalent to the cheese grater - and apparently the 1080 has pretty decent support on that platform as well, so I think it's a safe investment :)

You make a valid point. All the other upgrades are plug and play (albeit done carefully :) ) and I think I'm happy with what I've ordered/planned. But I am a little nervous about upgrading to Sierra/High Sierra and switching to an SSD plus HDD set-up without losing any of my data. It really would be good to get that done before my upgrades start arriving in the post, though. I'll start doing some research, but if anyone has any tips/hints please feel free to share.

Edit: I didn't realise how much SSD drives had come down in price! It might be easier just to get a 2 TB SSD drive and copy across a clone of my current installation. Hmm.

Edit 2: Ordered a Crucial MX500 2TB :) Just need to investigate the PCIe SATA III card situation. Will there be space to fit that in with a GTX 1080?
 

Your cMP can accommodate four PCIe cards, including the graphics card.
 
Just a quick update to say I've managed to upgrade from El Capitan to High Sierra, although I did run into the 'won't boot after successful installation of High Sierra' problem. Fortunately, booting into recovery mode and performing a disk repair seemed to do the trick and it's working now. Next step will be to clone my HDD over to the SSD when it arrives.

The processor is on a slow boat from China but the rest of the items should be arriving shortly so I'll (hopefully) be able to start upgrading this weekend :)
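(One small pre-clone sanity check, sketched in Python: it just confirms that the data on the current boot volume will fit on the new 2 TB drive. "/" is an assumption - point it at whichever volume you actually intend to clone:)

import shutil

SSD_CAPACITY = 2 * 1000**4                      # 2 TB as marketed (decimal bytes)
usage = shutil.disk_usage("/")                  # assumed source volume
print(f"Used on source volume: {usage.used / 1e9:.0f} GB")
print("Fits on the 2 TB SSD:", usage.used < SSD_CAPACITY)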
 
I went with a 580 simply because macOS has a dedicated driver for AMD cards now. I game Fortnite on it with high settings.

There are plenty of cheaper 580s on eBay if you look around; I just went with a regular blower version instead of the Pulse and it works fine.
 

Good to know you've had a good experience! And yes, it does seem like Apple is more in the AMD camp at the moment :)

General Update: Surprisingly, the processor upgrade is the first part of my plan I've been able to complete, so I did that today. Geekbench before (i.e. no upgrades - 2.93 GHz Quad, 16GB 1066 RAM, GTX 680, HDD etc.) was 2662 single-core and 8750 multi-core. After installing the 3.46 GHz 6-core Westmere, Geekbench came back with 3113 single and 13540 multi - quite a nice improvement :)

Will run again once my faster RAM arrives and again once everything else is installed.
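(Quick arithmetic on those Geekbench numbers, just to put the gain in percentage terms:)

before_single, before_multi = 2662, 8750
after_single, after_multi = 3113, 13540
print(f"Single-core: {after_single / before_single - 1:.0%} faster")   # roughly 17%
print(f"Multi-core:  {after_multi / before_multi - 1:.0%} faster")     # roughly 55%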
Query: Looking at the sensor readings I've got CPU A Tdiode at 33; CPU A Core 0 Relative to ProcHot at 70 and CPU A Heatsink at 30. Is that normal? What does the relative mean?

When I reconnected the heatsink I was careful to tighten the hex screws gradually, alternating between them a turn at a time, and stopped when I felt firm resistance (i.e. I didn't want to over-tighten and break anything).

Also, strangely, if I put some load on the processor, Tdiode goes up and the relative reading goes down, e.g. Tdiode 57; relative 37.

EDIT: I think I've been silly. I found a mention on another thread: 'In iStat, also pay attention to the 'CPU Core0 Relative to...' reading. It means relative to 'ProcHot'. ProcHot (or 'Processor Hot') is, if I'm remembering correctly, the maximum temp that your cores should reach before there is trouble. iStat, I believe, defines this as 100C or 212F.

It's a countdown measurement, so the HIGHER the reading, the better. See how low the reading gets when you're stressing your CPU.'

So basically it sounds like Tdiode/Heatsink are giving me my actual readings (which are great idling and not bad under stress) and the Relative means my temperatures relative to the point it becomes an issue are also fine?
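(As a sanity check on that interpretation, a tiny Python sketch - it assumes the 'Relative to ProcHot' value really is just the ProcHot limit minus the core temperature, with the limit at 100C as the quoted post suggests; that's an assumption, not a verified spec for this CPU:)

PROCHOT_LIMIT_C = 100
for tdiode_c, relative in [(33, 70), (57, 37)]:
    implied_core_c = PROCHOT_LIMIT_C - relative
    print(f"Relative {relative} -> core around {implied_core_c} C (Tdiode read {tdiode_c} C)")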
 
I think the most accurate method for torquing is to count the number of turns coming out
Perhaps a point of clarification...the most accurate method for applying the correct torque is with a torque wrench, like those used for rifle scopes and such. They measure inch-pounds quite well. From the Mac Pro 2009 Technician Guide, the recommended installation is first to 4 in-lbs, then to 8 in-lbs, in the specified order.
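(For anyone whose torque driver reads in metric, the same figures converted - a trivial sketch using the standard factor of roughly 0.113 N·m per in-lb:)

IN_LB_TO_NM = 0.1129848
for in_lb in (4, 8):
    print(f"{in_lb} in-lb is about {in_lb * IN_LB_TO_NM:.2f} N·m")   # 0.45 and 0.90 N·m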
 

Not sure if we can still use this method on an unprotected socket with a "wrong" CPU. This procedure is designed for the "correct" CPU.

So basically it sounds like Tdiode/Heatsink are giving me my actual readings (which are great idling and not bad under stress) and the Relative means my temperatures relative to the point it becomes an issue are also fine?

Correct, as long as that Relative to ProcHot is not zero, you should be fine.
 
Not sure if we can still use this method on an unprotected socket with a "wrong" CPU.
Can you elaborate on this?

Here's my thinking on using a torque wrench: it measures the twisting force being applied to the fastener, which has a strong correlation to the amount of force applied to the two objects being fastened (variables include how smooth/frictionless the threads are and how much friction there is between the fastener head and the object). I believe measuring with a torque wrench would account for variations (within limits) in the thickness of the CPU.

If I understand this correctly, maintaining the proper tightness is really about applying enough force between the heatsink and the CPU to enable proper heat transfer, but not so much that something breaks across the range of operating temperatures. My issue with the "turn-count" method is that it's somewhat difficult to determine the actual starting and ending points. When, exactly, does the screw finish coming out, and when, exactly, does it start going in?

I get that the turn-count method works, in practice. But I don't see how that can be more reliably accurate. But if there is a better method, I'd like to understand that.
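(The relation being described is usually approximated as torque = K x clamping force x screw diameter, where K is a friction-dependent "nut factor". A rough Python sketch - K = 0.2 and a 3 mm screw are purely illustrative assumptions, not the spec of the actual heatsink screws:)

def clamp_force_n(torque_nm, k=0.2, diameter_m=0.003):
    # torque = k * force * diameter, rearranged for force
    return torque_nm / (k * diameter_m)

print(f"{clamp_force_n(0.90):.0f} N at 8 in-lb (~0.90 N·m) with the assumed K and diameter")
print(f"{clamp_force_n(0.90, k=0.3):.0f} N if the threads have more friction - same torque, less clamp")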
 

The problem is the socket, not the contact between CPU and heatsink.

IMO, there are 2 cases.

1) All "weight" (including the pressure from the heatsink) is supported by the socket only

2) All "weight" is supported by the socket + something else (e.g. the four red circle areas in the pic)
[Attached photo: the four support areas circled in red]

If it's case 1, then I believe using a torque wrench will be fine, because the "weight" change from "CPU + heatsink pressure" to "CPU + heatsink + lid" should be very tiny (the pressure between CPU and heatsink has to be quite strong anyway, otherwise the thermal paste can't work properly, and the lid's weight is negligible by comparison).

However, I tend to believe we actually belong to case 2. The specified torque is supposed to correspond to "resistance from the CPU" + "resistance from the other supports" (e.g. the red circle areas). Without washers, the heatsink's pressure will apply solely to the CPU, with the "something else" missing, which means the socket may experience much more pressure than normal.

To elaborate more: let's say the manual suggests using torque = 9,

and it expects that when the heatsink gets into the optimised position, there will be some "resistance" from the red circle areas. So the suggested torque = 9 is actually CPU = 5 + each red circle area = 1.

Without washers, if we still apply torque = 9, all of that pressure will go to the CPU, and that's much more pressure than in the original situation. Therefore, this extra pressure may damage the socket.
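(A toy Python sketch of that argument, using the illustrative 9 = 5 + 4 x 1 split from the post above - not real figures for the actual hardware:)

suggested_torque = 9            # what the manual calls for, in arbitrary units
support_count = 4               # the four red-circled support areas
load_per_support = 1
cpu_load_with_supports = suggested_torque - support_count * load_per_support   # 5
cpu_load_without_supports = suggested_torque                                   # all 9 on the CPU/socket
print(f"{cpu_load_without_supports / cpu_load_with_supports:.1f}x the intended load on the socket")   # 1.8x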
 
Update:

Installed the new NVIDIA card and used the cable adaptor suggested above and all of this worked fine :)

I've installed the 48 GB of RAM today. However, it's only running at 1066 according to About This Mac. I've searched online and saw people saying to reset the PRAM, after which it was recognised fine. Unfortunately, despite having a wired keyboard plugged directly into the Mac Pro's USB port, it wouldn't accept option-command-P-R. So I searched and found a way to reset it through Terminal (sudo nvram -c). This seemed to work, because when I restarted I had no display (the NVIDIA 1080 was plugged in :D ).

So... I've put the old graphics card back in and got back to the desktop, and it's still showing 1066. I've since done an SMC reset and a PRAM/NVRAM reset and it's still not running at 1333. I've installed the 3 RAM modules in slots 1-3 and I'm using the Westmere 3.46 6-core, so it should recognise the RAM as 1333. Any other suggestions?

The RAM I bought was OWC1333D3X9M048 48GB PC10600 (16GB x 3) Mac Pro DDR3 ECC, EAN: 0794504328257.
 
How does System Information > Memory report the slots and each individual DIMM?
Do all Manufacturer and Part Numbers match?
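(For reference, the same data can be dumped from Terminal; system_profiler's SPMemoryDataType section mirrors System Information > Memory. A small Python wrapper, if you want to grab it programmatically:)

import subprocess

report = subprocess.run(
    ["system_profiler", "SPMemoryDataType"],
    capture_output=True, text=True, check=True,
).stdout
print(report)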
 
It gives me the following report (at least it's consistent):

Memory Slots:
ECC: Enabled
Upgradeable Memory: Yes

DIMM 1: Size: 16 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: -; Serial Number: -
DIMM 2: Size: 16 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: -; Serial Number: -
DIMM 3: Size: 16 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: -; Serial Number: -
DIMM 4: Empty
 
Therefore, this extra pressure may damage the socket.
Thanks for sharing your explanation on this. If I understand it correctly, you are saying that torque needs to be properly supported, which is to be expected. You mention washers in Case 2. Is this a reference to using lidded CPUs in a dual socket 4,1?
 
Manufacturer: DIMM 1, DIMM 2, and DIMM 3 should all have a manufacturer number of some kind. The 0x0000 may be a signal there is an issue with the RAM or installation. (Mine currently show as 0x80CE, which I believe is Samsung.)

Part Number: All should ID a fairly long part number. (Mine show 0x followed by 36 digits/characters.)

Serial Number: Should not be blank and would not be identical between each DIMM. (Mine show 0x followed by 8 digits/characters.)

I'm going to guess there is an issue with your system identifying the RAM, which is what's causing the 1333MHz speed issue. Try troubleshooting the RAM identification and that should (hopefully) resolve the speed issue. Depending on where you purchased this RAM, it may be worth exchanging.

Do you have any of your older sticks available?
Can you insert one stick into DIMM 1 to see if your machine can ID that RAM properly?
 
I've put the old RAM back in and the report gives the following (it does give a part number, but the rest is similar):

Memory Slots:
ECC: Enabled
Upgradeable Memory: Yes

DIMM 1: Size: 4 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: 0x256832473800553641465238432D48392000; Serial Number: -
DIMM 2: Size: 4 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: 0x256832473800553641465238432D48392000; Serial Number: -
DIMM 3: Size: 4 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: 0x256832473800553641465238432D48392000; Serial Number: -
DIMM 4: Size: 4 GB; Type: DDR3 ECC; Speed: 1066 MHz; Status: OK; Manufacturer: 0x0000; Part Number: 0x256832473800553641465238432D48392000; Serial Number: -
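(As an aside, those long 0x part numbers are just hex-encoded ASCII bytes. A minimal Python sketch to decode the one above - purely illustrative, and some of the bytes come out as non-printable:)

hex_part_number = "256832473800553641465238432D48392000"
print(bytes.fromhex(hex_part_number))   # prints the raw bytes; most of them are readable ASCII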
 
That's a good sign. The RAM slots shouldn't be damaged in any way if they can ID part numbers.

Where did you buy the 3x16 OWC RAM modules?
 