
C_Montgomery
macrumors newbie, Original poster
Sep 15, 2018, Louisiana, U.S.
So my specs are:
  • Mac Pro (Mid 2010) - running macOS High Sierra 10.13.6
  • 2.8 GHz Quad-Core Intel Xeon
  • 32 GB 1066 MHz DDR3 RAM
  • ATI Radeon HD 5770 1024 MB
  • Apple 30" Cinema Display (2560 x 1600)
I hadn't run into any issues and have been chugging along just fine (after a max RAM upgrade a year ago). It wasn't until after I updated to High Sierra that I noticed my GPU readings (in Activity Monitor) are nonexistent; I get no readings at all. I discovered this after opening an older Premiere Pro project that now throws all kinds of "Low Level Exception" errors.
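As a sanity check on the blank Activity Monitor readings, macOS's own hardware report can confirm whether the OS still detects the card at all. A minimal sketch: the `system_profiler` command is real, but the sample output below is assumed and abridged, not captured from this machine.

```shell
# On the Mac Pro itself you'd run:
#   system_profiler SPDisplaysDataType
# which lists every GPU macOS has detected, independent of Activity Monitor.
# Here we filter an assumed, abridged sample of the stock card's entry down
# to the fields worth checking (chipset and VRAM):
sample_output='Graphics/Displays:
    ATI Radeon HD 5770:
      Chipset Model: ATI Radeon HD 5770
      VRAM (Total): 1024 MB
      Resolution: 2560 x 1600'

echo "$sample_output" | grep -E 'Chipset Model|VRAM'
```

If the card shows up here with its chipset and VRAM intact, the hardware is detected and the missing readings are more likely a driver/OS-support gap than a dying card.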

I'd like to know if this is the end of the road for me and the Mac Pro (2010). If I can get the video card upgraded, I'd be more than thrilled to keep chugging along with this ol' faithful tower.
My son just purchased a new video card and has a hand-me-down card that's not all that lacking and might work... or so I'd like to think.

He has an EVGA GeForce GTX 1060 FTW2, 6GB GDDR5 (link):
https://www.evga.com/products/product.aspx?pn=06G-P4-6766-KR

From what I can tell, it's clearly not "Mac" ready, but is there an option for compatibility with the Mac (via drivers, etc.)?

Also, I'm concerned overall about the blank GPU readings, and can only assume it's because the factory stock card is outdated as far as support by the current OS goes. Still, it bothers me to have no readings and seemingly nonexistent performance in graphics-intensive apps. Could there be other quirks I need to address aside from simply upgrading the GPU?

Thanks!
 
That GTX 1060 card should work fine (EDIT: I do not have personal experience w/ this, only a 1080 Ti...see notes from BillyBobBongo & flowrider below), but you'll have to pace your system and security updates w/ the release of Nvidia's "Web Drivers" (do a search for that here) in order to keep it working. Inadvertently update your system before the latest Nvidia driver release, and you'll have to figure out another way to get in (remote access, alternate GPU) and update them.

A very good choice, and one that is supported by Apple for Mojave, is the Radeon RX 580. Do a search here...or just skim the Mac Pro forum main page; there are seemingly hundreds of threads about it. :D That's not an expensive card, but an even less expensive one, whose low-$100s (USD) cost would probably be more than offset by eliminating the hassle of the Nvidia driver updates, is the RX 560.
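The update-pacing point above can be made concrete: Nvidia's web drivers are pinned to an exact macOS build number, so before accepting a system update you'd compare your current build against the build the latest driver supports. A minimal sketch; the build strings below are examples, and on the Mac the real current build comes from `sw_vers -buildVersion`.

```shell
# Sketch: decide whether it's safe to update, assuming build strings like
# "17G65". On the Mac Pro, the current build would come from:
#   sw_vers -buildVersion
# and the supported build from Nvidia's web-driver release notes.
driver_matches_build() {
  current_build="$1"   # e.g. output of: sw_vers -buildVersion
  driver_build="$2"    # build listed for the latest Nvidia web driver
  [ "$current_build" = "$driver_build" ]
}

# Example comparison (values are illustrative, not this machine's):
if driver_matches_build "17G65" "17G65"; then
  echo "driver matches this macOS build"
else
  echo "hold off on the system update"
fi
```

The point of the manual check is the failure mode fhturner describes: update first and the card goes dark until a matching driver ships, leaving remote access or a second GPU as the only way back in.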
 
Thanks so much for the tips and heads-up. I will avoid the potential hassle of the GTX 1060 in that case. The Radeon RX 580 and 560 seem like great options (thanks fhturner), price-wise and still a significant upgrade over my 1 GB card. I'll definitely go that route; within a couple weeks I'll be able to purchase one and will follow up on how it works out.

What would be my options if the goal is to connect two Apple Cinema 30" Displays? I can do it with my current card, but it took some aftermarket adapters since the monitors use DVI connections. I ended up having to use a "Mini DisplayPort to Dual-Link DVI" adapter on each monitor. I notice most of the newer cards have only a single DVI connection. I don't know if it matters, but I need a continuous screen spread, not mirrored.

If this is a priority, would a card more tailored for the Mac, like the Sapphire HD 7950 (Mac) with its 2 Mini DisplayPorts, be a necessity? Or do DisplayPort-to-DVI adapters get the job done?
 