The Vega 64 doesn't require additional drivers? macOS Mojave comes with built-in drivers for the Vega 64? I'm planning to buy a Vega 64 in order to upgrade to Mojave.

The driver is there. In fact, it's already working in the current High Sierra. However, the graphics card may suffer from a high fan speed issue.

That driver may have been designed only for the iMac Pro's Vega 64, so it can't control a normal PCIe card's fan properly.
 
The Vega 64 doesn't require additional drivers? macOS Mojave comes with built-in drivers for the Vega 64? I'm planning to buy a Vega 64 in order to upgrade to Mojave.

Yep. See this thread: https://forums.macrumors.com/threads/amd-polaris-vega-gpu-macos-support.2083168/

However, you can't update the firmware with a Vega card while running the latest Mojave beta (though you can by opening the installer under a previous build). So Apple may or may not provide better support for Vega in the Mac Pro 5,1. It looks like they want us to use certain Polaris cards that don't require a power mod.
 
PCIe 1.1 = 2.5GT/s
PCIe 2.0 = 5GT/s
PCIe 3.0 = 8GT/s
Important note: PCIe 3.0 at 8 GT/s is basically twice as fast as PCIe 2.0 at 5 GT/s.

PCIe 2.0 (and earlier) uses 20% of the bandwidth for overhead and error checking (8b/10b encoding), leaving 4 GT/s for usable data (8 GB/s usable for an x16 link).

PCIe 3.0 uses about 1.54% of the bandwidth for overhead and error checking (128b/130b encoding), leaving about 7.88 GT/s for usable data (about 15.8 GB/s usable for an x16 link).
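The arithmetic above can be sketched in a few lines (a minimal illustration; the `usable_gbit_s` helper name is mine, the encoding figures are the ones from the post):

```python
# Usable PCIe bandwidth per lane after line-encoding overhead.
# PCIe 1.x/2.0 use 8b/10b encoding (20% overhead); PCIe 3.0 uses 128b/130b (~1.54%).

def usable_gbit_s(transfer_rate_gt_s, payload_bits, encoded_bits):
    """Usable gigabits per second per lane."""
    return transfer_rate_gt_s * payload_bits / encoded_bits

gen2 = usable_gbit_s(5.0, 8, 10)      # 4.0 Gbit/s per lane
gen3 = usable_gbit_s(8.0, 128, 130)   # ~7.88 Gbit/s per lane

# An x16 link, converted to gigabytes per second (8 bits per byte):
print(gen2 * 16 / 8)              # 8.0 GB/s
print(round(gen3 * 16 / 8, 1))    # 15.8 GB/s
```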
 
A question for those who have installed a GT120 + RX 580: have you tried exporting a movie in FCPX or, better, running the BruceX test?
I tried with the 580 + GT120 (even without monitors connected to the GT120, just having it installed in a PCIe slot), and my results are very poor (over 5 minutes); it seems the work is done only by the GT120.
If I try with only the 580, the results are good (35 seconds).

Is that normal?

Yes, I can confirm this is the case. I have a PCIe expansion chassis connected to my Mac Pro, and I was going to leave the GT120 in there for situations where I needed a boot picker or a firmware update. When I ran some render tests and BruceX, I found that the acceleration from the RX 580 was compromised by having both cards in there. It's like FCPX can't help itself and wants to use any available card, despite the hamstringing the GT120 does...and unfortunately there's no preference to tell it not to. So I removed the GT120 and put it on the shelf for emergencies (fortunately the firmware updates are now doable w/o an EFI GPU).
Great news!!

I just installed the Mojave public beta, and HEVC 4K 60fps decoding is now enabled on my RX 580. I can now play back HEVC 4K 60fps in QuickTime with very low CPU usage!!! It's super smooth!

HEVC GPU decoding is supported for the Rec. 709 profile. That's the HD color space, so it's nothing special even at 4K.

For the Rec. 2020 HDR color space, there is still no smooth playback on the GPU.

Use Get Info on a media file to check the color space.

Thomasthegps, how many of those sample videos did you test out for HEVC decoding? This (hardware decoding for 4K H.264/265) is what I've been dying for...it's the only thing that really hamstrings my Mac Pro for editing compared to my MacBook Pro, since no Quick Sync. I don't mind waiting for encoding to use CPU, but choppy playback is a killer.

Not sure if this is contrary to what Soy said above, but as of Mojave DP8, I'm only getting good results (GPU engaged and very little CPU) on 10-bit content, NOT 8-bit. But perhaps that's 10-bit Rec. 709? That camping video is 10-bit, as is the Bravia OLED glass-blowing video, and they play back w/ under 10% CPU usage on my 12-core 3.46GHz machine, at about 30% RX 580 GPU usage.

The others, such as Surfing and Aquarium, are 8-bit, and they utilize a LOT more overall: 300-500% CPU and around 85% GPU. This is the same as I see in High Sierra 10.13.6...well, HS actually uses less GPU. Also, a 4K H.264 test file of my own from my Lumix G85 shows no improvement that I can tell between HS & MJ. HEVC is welcome, of course, but for me, I'd rather have 4K H.264 decode before anything.

Thoughts? Below are my results.
Fred


HEVC Playback/Decode Tests

1. Bravia OLED 4K [73Mbps, 4k60, 10-bit]
10.13.6 HS: CPU 450-650%, GPU 40%
10.14 DP8 MJ: CPU <10%, GPU 30%

2. Surfing [77Mbps, 4k60, 8-bit]
HS: CPU 350-500%, GPU 25-30%
MJ: CPU 350-500%, GPU 85%

3. Aquarium [80Mbps, 4k60, 8-bit]
HS: CPU 300-450%, GPU 25-30%
MJ: CPU 300-500%, GPU 85%

Note that the 8-bit GPU usage in Mojave is quite a bit higher than High Sierra...but the CPU usage doesn't really change.

EDIT: More on this here: https://forums.macrumors.com/threads/hdr-4k-video-support-macos-vs-windows.2130924/#post-26328150
 

I tested a few 4K files off the internet. Most of them were demos for Sony/LG/Samsung.
I too would like 4K H.264 hardware decode. Some previous AMD GPUs were able to decode 4K H.264, so my guess is that Polaris can as well. I had also noticed several similar instabilities with decoding. Sometimes decoding crashed and I had to quit and reopen the application. This happened with both QuickTime and VLC. I was hoping these issues would be fixed by now...
 
Yesterday I bought an Asus ROG Strix RX560 4GB for my Mac Pro octo-core 4,1 flashed to 5,1, running macOS Sierra (not HS). Apple specifies an MSI RX560.

Will my Asus card be supported in Mojave? Or should I return the purchase and buy another card?

I read here about new firmware installed via High Sierra or Mojave, but I have Sierra now. Should I install High Sierra before Mojave to get these firmware updates on my Mac Pro?

Thanks in advance
 
Yesterday I bought an Asus ROG Strix RX560 4GB for my Mac Pro octo-core 4,1 flashed to 5,1, running macOS Sierra (not HS). Apple specifies an MSI RX560.

Will my Asus card be supported in Mojave?

Yes.

I read here about new firmware installed via High Sierra or Mojave, but I have Sierra now. Should I install High Sierra before Mojave to get these firmware updates on my Mac Pro?

Thanks in advance

Apple's documentation requires 10.13.6 installed on your Mac. The Mojave installer will first upgrade your firmware to 138.0.0.0.0, then the installation starts.

What's your current BootROM (System Information>Hardware>BootROM Version)?
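If you'd rather check from a terminal than through System Information, the BootROM string can be pulled from `system_profiler` output. A minimal sketch (the `boot_rom_version` helper is mine, and the sample text is illustrative; on a real Mac you'd feed it the live report):

```python
import re

# On a real Mac, capture the report with:
#   import subprocess
#   report = subprocess.run(["system_profiler", "SPHardwareDataType"],
#                           capture_output=True, text=True).stdout

def boot_rom_version(report: str) -> str:
    """Pull the Boot ROM version out of `system_profiler SPHardwareDataType` output."""
    m = re.search(r"Boot ROM Version:\s*(\S+)", report)
    return m.group(1) if m else "unknown"

sample = """Hardware:
    Hardware Overview:
      Model Identifier: MacPro5,1
      Boot ROM Version: 138.0.0.0.0
"""
print(boot_rom_version(sample))  # 138.0.0.0.0
```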
 
Ufff, thanks for your fast reply; I'm on vacation for a few days. When I return home I will check the BootROM version.

Thanks again
 
Hi, I'm reading through this trying to understand which cards are compatible. I'm in the UK and trying to find a Sapphire 7950 is really difficult. The only one available on eBay is £330 and doesn't even include the power cables.

I'm now looking at the GTX680, they are selling the flashed version for £120.

I'm currently running a 3 display setup with 2xRadeon 5770 cards. I've been having loads of problems since upgrading to High Sierra with red boxes on my screen and being logged out when moving between fullscreen apps.

I mainly use my Mac for editing code and some Final Cut Pro; would the GTX680 be a good option? I think I can daisy-chain two of my displays via DisplayPort, and my third display is a TV with HDMI.
 
As @crjackson2134 stated, just get one of these, which is approved by Apple for Mojave:
Sapphire Pulse Radeon RX 580 8G GDDR5 Dual HDMI/DVI-D/Dual DP Graphics Card - Black
 
As an Amazon Associate, MacRumors earns a commission from qualifying purchases made through links in this post.
Fantastic, that's just what I needed, a definite push in the right direction.
Just get a Sapphire Pulse RX580 and call it a day.
Thank you so much, I almost paid £330 for a second hand card with nowhere near the performance and no cables.
 
NVIDIA cards need the NVIDIA Web Drivers to work properly, for both flashed and non-flashed GPUs. Until the NVIDIA Web Drivers are released for Mojave, it is best to stick with AMD. (The Sapphire PULSE RX 580 is officially recommended and works.)

Worth mentioning: it is not yet 100% confirmed that NVIDIA Web Drivers for Mojave will be released, but many expect them to be available. Older flashed GPUs like the officially released GTX 680 for Mac appear to work fine with beta releases, but some newer series (like the 9xx and 10xx) have presented several issues that will not be addressed until drivers are available. Newer-generation NVIDIA GPUs do not have native drivers built into the OS and rely on the NVIDIA Web Drivers for full function.
 

My Vega 56 is showing as 8.0 GT/s on my Mac Pro using this latest firmware. But I assume even if it's being seen that way, that it is still operating at PCIe 2.0 speeds?
 
Keep in mind that card has no boot screens, so it can be a royal pain. I'm living with it now but seriously considering getting a flashed card from http://www.macvidcards.com/store/c10/Recommended:_Mac_Pro_4,1_and_5,1.html

A comment on the Amazon page says you get boot screen:
A good investment for Mac Pro 5.1. Running Sierra 10.12.6 or after you get boot screen as well. Pleased.

I'm guessing you guys know better, maybe he's getting confused with a partial boot screen?

I pulled the trigger on the card, arriving tomorrow. I was going to sell both my 5770s, maybe I should keep one around for boot problems.
 

@TC!!
It doesn't come with the needed power cable. You need to get a dual mini 6-pin to single 8-pin GPU Y-cable before installing.
Keep in mind that card has no boot screens, so it can be a royal pain. I'm living with it now but seriously considering getting a flashed card from http://www.macvidcards.com/store/c10/Recommended:_Mac_Pro_4,1_and_5,1.html

I don’t need to keep it in mind, I already have one. Not a royal pain to me at all.
 
My Vega 56 is showing as 8.0 GT/s on my Mac Pro using this latest firmware. But I assume even if it's being seen that way, that it is still operating at PCIe 2.0 speeds?

Which "latest" do you mean (latest official or latest beta)? The one reporting BootROM 138.0.0.0.0?


This is a good tool to test if your card is running at PCIe 2.0

https://itunes.apple.com/hk/app/cl-ing/id1244392031?l=en&mt=12

 

It doesn't have to be a Sapphire. It just needs to have the same ports: one DVI, one HDMI and two Mini DisplayPorts. Avoid cards that deviate from that setup. Newer releases of the HD7xxx/R9 2xx series have two DVI ports, one HDMI and one full-size DisplayPort. Those cards have issues; some can be flashed but not all the ports work.

My previous 7950 was a Gigabyte Windforce. That flashed like a charm. If you don't like eBay, take a look at your local CEX. If it has a 7950, a 7970, an R9 280 or an R9 280X with the requisite ports, then any of those cards will do. Just have a read of the flashing thread on netkas.org, as several makes are covered there: those that work and those that don't.

My current card is an MSI Twin Frozr R9 280X. Keep an eye out for that if you can't find the ageing Sapphire.
 

Not entirely true.

Even with the same port layout, the framebuffer personality can still be different.

e.g. For the HD7000 series, all of

Hamachi (4) @ 0x59480
Aji (4) @ 0x59170
Dashimaki (4) @ 0x592b0
Futomaki (4) @ 0x59420
DashimakiS (4) @ 0x597b0

have the same DP, DP, DVI-D, HDMI port layout, but only a card that uses Hamachi is guaranteed to have all ports still working after the flash.

Getting a card with the same port layout is still the much better option, just not completely bulletproof.
 

I've never heard of CL!ng before; I've always used CUDA-Z for such a purpose. It seems like the results are different. But in any case, even though PCIe shows me running at 2.5 GT/s, both of these tools say otherwise, as CUDA-Z always has.



Lou
 

IMO, CL!ng is a better tool just to confirm 5 GT/s capability. It's straight from the Mac App Store, and it works with any GPU.

As long as the Host <-> Device reading is in the 5000 MB/s range, the card is negotiating at PCIe 2.0 speed.

I wouldn't worry too much about the performance figure differences. Those are OpenCL figures, specific to that code path only, and can't be compared to the CUDA figures.
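That rule of thumb can be written down as a tiny classifier (a sketch only; the `likely_pcie_gen` helper and its cutoffs are my own illustrative values, derived from the theoretical x16 figures earlier in the thread, since real transfers land well below the ceiling):

```python
def likely_pcie_gen(host_device_gb_s: float) -> str:
    """Guess the negotiated PCIe generation of an x16 link from a measured
    host<->device bandwidth figure (GB/s). Theoretical usable x16 bandwidth
    is ~4 GB/s (gen1), ~8 GB/s (gen2), ~15.8 GB/s (gen3); the cutoffs below
    are rough, illustrative thresholds between those tiers."""
    if host_device_gb_s < 3.5:
        return "PCIe 1.x (2.5 GT/s)"
    if host_device_gb_s < 7.0:
        return "PCIe 2.0 (5 GT/s)"
    return "PCIe 3.0 (8 GT/s)"

# A reading in the 5000 MB/s range, as described above:
print(likely_pcie_gen(5.0))  # PCIe 2.0 (5 GT/s)
```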
 
Yes, I'm running the latest 138.0.0.0.0 firmware. Just ran this and it's negotiating at PCIe 2.0.

My Device to Device figure is showing a much higher number than yours, though... any reason for that?
 