
Archetypemusic · macrumors newbie · Original poster · Jun 15, 2023
Hi MacRumors.

I've got a Mac Pro 5,1 (2010) and a 30" Apple Cinema Display (OS X 10.13.6 High Sierra) on which I'm trying to watch 1080p and 1440p video files. My poor ATI Radeon HD 5770 graphics card starts to blotch out with 1080p, and 1440p makes it super blotchy with un-smooth motion.

I'm not a gamer, nor am I running multiple monitors. I'm just looking for the cheapest and simplest replacement graphics card to make watching an HD movie a smooth and beautiful viewing experience.

What would be great is:
- A boot screen
- Metal
- Dual DVI

...or (most preferably) just a more powerful (non-Metal) Mac Edition card from back in the day, just so I can swap it out without any OS/software changes.

Any recommendations would be welcome.

thanks,
AM
 

Attachments

  • 1080p on Mac Pro 5.1 (ATI 5770).png (59.8 KB)
  • 1440p on Mac Pro 5.1 (ATI 5770).png (29 KB)
  • 2160p on Mac Pro 5.1 (ATI 5770).png (35.5 KB)
Cheapest that isn’t super ancient is probably an RX 560 or WX 4100, then an RX 580 or WX 7100. If you can do OpenCore with Monterey or Ventura, then an RX 6600 XT.

Here’s a list of supported models.

 
Excellent, MisterAndrew. Thanks so much for the link. I did find a GTX 680 Mac Edition, which is my main contender. Plug and play with no flash needed. It would probably be around $250 when all is said and done. I'm assuming I can run a 2K movie on my 30" monitor and have smooth gradation and no pixelation. I see a lot of tech specs for all of the GPUs, but are there any I should focus on? Specs that are the key players in card performance?

Thanks again,

AM
 
So, after prowling around forums, it seems that people don't have an issue viewing 1080p content at full res with the HD 5770 card. How come:
1) I'm getting banding when viewing 1080p (but the motion is still good)?
2) I'm getting terrible banding and jerky, slow playback when viewing 1440p?
3) I'm getting terrible banding and jerky, super slow playback when viewing 2160p?

Is it the card or is it the computer? I've been trying to find some way to optimize the computer, but I can't find any playback/display settings to adjust. I'm looking at a GTX 680 Mac Edition card, but I'm a bit worried that, if it's actually a computer issue, I'm wasting my money on the card.
 
AFAIK, banding is usually due to 8-bit colour, not really resolution. But the higher the resolution, the more detail you can see, which "improves" the banding's visibility and therefore makes it look worse.

Your setup should be using CPU decode, not GPU decode. So your CPU is fast enough to decode 1080p video, but not quite fast enough for 1440p, and way below the performance required for 2160p (for that particular codec).

I don't think upgrading from the 5770 to a GTX 680 will help video playback in your case.

However, if you go all the way to Mojave, upgrade the GPU to an RX 460 or newer, and use OpenCore to activate hardware acceleration, then your cMP will be able to use GPU decode. In that case, it will be able to play very demanding 2160p HEVC video very smoothly (like this one).
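As a back-of-the-envelope illustration of why a CPU that just manages 1080p falls over at higher resolutions: software decode cost grows roughly with the pixel count per frame. This is a simplifying assumption (real codecs and encoder settings vary a lot), so treat the numbers as illustrative only:

```python
# Rough sketch: software (CPU) decode work scales roughly with pixels
# per frame, so a machine that barely keeps up at 1080p falls behind
# quickly at 1440p and 2160p. Illustrative, not a benchmark.

def pixels(width, height):
    return width * height

def relative_decode_load(width, height, base=(1920, 1080)):
    """Decode load relative to 1080p, assuming cost ~ pixel count."""
    return pixels(width, height) / pixels(*base)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "2160p": (3840, 2160)}.items():
    print(f"{name}: {relative_decode_load(w, h):.2f}x the 1080p decode work")
# 1080p: 1.00x, 1440p: 1.78x, 2160p: 4.00x
```

So 1440p asks for nearly double the 1080p decode work and 2160p asks for four times, which lines up with "fine at 1080p, struggling at 1440p, hopeless at 2160p".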
 
Great response, h98.

So how can I tell if the CPU is doing the decoding? I can understand the 8-bit colour of the HD YouTube video (attached screen grabs), but I also get the banding when playing a Blu-ray disc or the equivalent file from a drive. I'm a bit hesitant to make any major changes to my system, as I use it for music production and it's quite stable.

Being a noob at this: what is a cMP?

Thanks again,

AM
 
I asked that question when I first started on this forum. 'cMP' refers to "classic Mac Pro" (4,1 or 5,1) or "cheesegrater Mac Pro".

Technically, the most correct answer is the "NVIDIA Quadro K5000 Mac Edition" because you get:
- METAL SUPPORT
- NATIVE BOOT SCREENS
- 4K DISPLAYPORT OUTPUT (x2)
- DUAL DVI
- NATIVE PSU REQUIREMENTS MET (2x 6-pin)

...and you'll find it much easier to source than a Mac Edition GTX 680.
 

You can get a Sapphire Pulse RX 580 from AliExpress for a little more than 100 bucks, then inject EnableGop into the GPU or the Mac Pro BootROM yourself (or pay for a BootROM reconstruction service and have everything done professionally) and have everything, including VideoToolbox hardware acceleration, which NVIDIA GPUs don't have with macOS.

Buying an NVIDIA GPU from 2013/2014 for a MacPro5,1 is a bad decision right now, even if you need CUDA.
 
I did look for the Quadro Mac Edition, but they're quite expensive if they're genuine. I've seen other recommendations for the Sapphire Pulse RX 580 as a great modern card that works in a Mac. There are actually quite a few used ones on my local Kijiji for $100-$150, so that seems like a good deal compared to double the cost for a decade-old card.

That said, I went down the Blu-ray rabbit hole and found out some interesting stuff. The Blu-ray that I'm using/testing is Whitesnake (an animated movie). Viewing the disc with the iDeer Blu-ray player, or viewing the file version in VLC or JRiver Media Center, the banding is the same. I monkeyed around with some settings... same. Then I looked up a review of the Blu-ray and, what do you know, apparently it has a lot of banding. What? I thought Blu-ray was... amazing compared to DVD. In any event, digging deeper into Blu-ray banding, apparently it can be baked into the copy depending on the kind of compression used during manufacturing. Who knew? ...meaning, I didn't know that.

So now I don't know what I'm seeing. I figure if I can reproduce the Blu-ray version in the software, then it's the source. If it keeps changing depending on which software I view the file in, then the issue is in the computer and not on the disc... we'll see.
 

Attachments

  • Screen Shot 2023-06-21 at 3.04.21 PM.png (444.5 KB)
Your 30" ACD is an 8-bit monitor. No matter how good the Blu-ray video quality is, your monitor can only display 8-bit colour. If you want to get rid of banding, you need a graphics card that can output 10-bit and a monitor with 10-bit colour support. Practically any GPU and monitor that support HDR can do that.

If you prefer to keep your existing setup for your music production, then you have no choice but to accept the banding, and you'll also be unable to play some high resolution videos smoothly.

cMP is the nickname of the Mac Pro 5,1.
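A rough way to see why 8-bit output bands on smooth gradients: a ramp wider than 256 pixels has to repeat quantized levels, so the repeats show up as visible stripes. A minimal sketch (the 2560-pixel ramp width is just the horizontal resolution of the 30" ACD; real content rarely spans the full brightness range, so actual bands are usually wider):

```python
# Why an 8-bit panel bands on smooth gradients: with only 2^8 = 256
# levels, a full-width ramp must hold each level across many pixels,
# and those flat runs read as stripes. More bits = narrower stripes.

def band_width_px(ramp_width_px, bit_depth):
    """Width in pixels of each flat quantization band in a linear ramp."""
    levels = 2 ** bit_depth
    return ramp_width_px / levels

# A full-width gradient on a 2560-px-wide 30" ACD:
print(band_width_px(2560, 8))   # 8-bit: each band is 10 px wide
print(band_width_px(2560, 10))  # 10-bit: 2.5 px, far less visible
```

Ten-pixel-wide flat steps are easy to spot on a slow gradient, which is roughly what the attached screen grabs show.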
 
Thanks again, h98.

I'm searching GPUs and can't seem to find "10-bit" in any of the card specs. That said, Quadros appear to do 10-bit with DirectX, and I get the sense that the Sapphire 580 can do it in OpenGL. Does it matter to me? Are they still outputting 10-bit via HDMI/DP? The write-ups I've read so far are for graphics programs on different platforms.

As for monitors, I've found a 32" BenQ 4K and a crazy 42" ASUS PG42UQ which, to be honest, is what I paid for the Mac monitor back in the day... actually a little cheaper.

Any other options for cards/monitors?

Also, would upgrading from an 8-core 2.4GHz to a 12-core "something" improve, at least, the 2K playback?
 
For your info, most consumer-level products are just 8-bit + FRC, not true 10-bit, but most of the advertising still calls it 10-bit anyway. Still, 8-bit + FRC should be good enough for watching videos without banding.

The RX 580 can do 8-bit + FRC, no need to worry about that. In fact, almost any modern GPU can. 10-bit (including 8-bit + FRC) is the basic requirement for HDR. That's why I said practically any GPU / monitor that supports HDR can do 10-bit.
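For what it's worth, FRC ("frame rate control") works by flickering each pixel between the two nearest 8-bit values so the time-average lands on the 10-bit target, which is why it kills banding almost as well as true 10-bit. A toy sketch (deliberately simplified: real panels use spatio-temporal dithering patterns, and the 4-frame cycle here is illustrative):

```python
# Toy model of 8-bit + FRC: approximate a 10-bit level (0-1023) by
# alternating between the two nearest 8-bit values over a few frames
# so the time-average matches the target.

def frc_frames(level_10bit, n_frames=4):
    """Return n_frames of 8-bit values approximating a 10-bit level."""
    target = level_10bit / 4              # 10-bit -> 8-bit scale
    low = int(target)
    high_count = round((target - low) * n_frames)
    high = min(low + 1, 255)              # clamp at the 8-bit maximum
    return [high] * high_count + [low] * (n_frames - high_count)

frames = frc_frames(514)                  # between 128 and 129 in 8-bit
print(frames)                             # [129, 129, 128, 128]
print(sum(frames) / len(frames))          # time-average: 128.5
```

The panel never actually shows 128.5, but your eye averages the flicker and sees an in-between shade, so gradients get the extra intermediate steps that hide the bands.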

Even though HDR support is lacking in High Sierra, 10-bit colour is supported. Even my poorly supported 1080 Ti can do 10-bit colour in High Sierra.
3840x1080 HiDPI 10bit.png

And I did use the same monitor / OS with an RX 580. It can display 10-bit colour. So, no need to worry about that.

Also, would upgrading from an 8-core 2.4GHz to a 12-core "something" improve, at least, the 2K playback?
Yes, once you upgrade to dual X5690s, your cMP should be able to play some 2160p videos. At least, it should be able to play most popular H.264 4K videos.
 
For watching 4K UHD movies, I'm using an NVIDIA K5000 or GTX 680 in either Mojave, High Sierra or Catalina in a 2010 cMP. The Sapphire Pulse RX 580 is also a very good option. So far the movies play smoothly without banding or stuttering. It will also depend on the third-party video player and its settings. From your original post, I understand you prefer the simplest setup and a not-too-expensive graphics card. For watching 4K movies I use VLC, with the cache level in preferences set to "High". The 4K movie is transcoded using the MakeMKV app and played in VLC.

On the GPU front, there are some Sapphire Pulse RX 580s, unflashed, listed around $80 to $100 on eBay. I saw an old eBay listing of a Sapphire RX 580, flashed with a boot screen, for $155.

Pulse.jpg
 