Latest beta, 10.13.3 I think it is.
Luxmark score was 26807 with just the Vega card.

Interesting, a single VEGA 56 scores about 23000 for me, while dual VEGA hovers around 45000.

To put a comparison in the mix, dual RX 580 scores about 29000.
All in 10.13.2, all in a Mac Pro 5,1 6-core 3.46GHz, all internal.
 
The VBIOS version should show up here.
Screen Shot 2017-12-21 at 03.51.10.jpg
 
Actually, I must admit I don't know exactly, but it looks like that is the one.
Couldn't really find it :(

I tried that one. Exact same behavior as before. The BIOSes are the same between the reference cards anyway. You can upload your BIOS to the database using GPU-Z in Windows. That's most likely the one, though. There is a second version of the primary BIOS for the reference 56, but it behaves the same.
 
I've got a Vega FE running in my Mac Pro now. Probably temporary. Booted no problem but I've also got the fans running too high. The card isn't hot. Too bad... I found a used card and might keep it since it looks so sweet in my Mac, but I'll probably replace it with another 1080 Ti for more 'bang-for-buck'.

But it's an interesting card that sits between games and production: 16GB RAM and 10-bit out for monitors that support it. Needs proper drivers though.

MP_Frontier1.jpg
 
I'm debating getting a 1080 to replace my RX 580... can I just plop in any 1080, or will there be issues such as sleep and fan speed?
 
My MSI 1080 Ti Gaming didn't have any fan issues at the time (Sierra). It was very quiet, which was pleasant.

But I do feel that AMD cards are generally a slightly safer bet in terms of support. It would be nice to see Nvidia jump on the eGPU wagon and really start to actively optimise their cards for High Sierra. I'm not ruling it out.
 

I have a 980 and I was planning on switching to Vega as soon as I could, but then all this mining business threw a wrench in that plan. That being said, I'm hoping AMD sorts out some of the glitchiness I've been dealing with on Nvidia. I'm primarily doing video, and just want something that plays nice with FCP X and Resolve.

Also, as of 10.13.2 I've been having fan noise problems with my 980; it seems to be spinning at a base of 1170 RPM or higher no matter what, so maybe it's not strictly an AMD thing.
 

That's a cMP SMC bug, nothing to do with the 980. Just run Luxmark (or CUDA-Z, etc.) for a few seconds, and the PCIe fan will go back to normal idle.

You can also try an SMC reset until the fan goes back to normal, but it won't last long.

Or you can use MacsFanControl to build your own fan profile (e.g. based on PCIe ambient temperature), which can also fix the fan issue (you can set MacsFanControl to auto-load after every boot).
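
If you'd rather not keep Luxmark around just for this, any short burst of GPU work should serve the same purpose. Below is a minimal sketch, assuming Python with numpy and pyopencl installed (the array size and loop count are arbitrary choices of mine, not anything from this thread), that loads the GPU for a few seconds in the same spirit as a quick Luxmark run:

[CODE]
# Minimal OpenCL "warm-up": give the GPU a few seconds of work, standing in
# for a short Luxmark/CUDA-Z run. Assumes numpy + pyopencl are installed;
# the workload size and loop count are arbitrary.
import numpy as np
import pyopencl as cl

a = np.random.rand(2_000_000).astype(np.float32)

ctx = cl.create_some_context()      # choose the GPU device if prompted
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] * a[gid];
}
""").build()

# A few hundred launches is roughly a couple of seconds of load.
for _ in range(300):
    prg.square(queue, a.shape, None, a_buf, out_buf)
queue.finish()
print("Done - check whether the PCIe fan has dropped back to idle.")
[/CODE]

Whether this resets the fan exactly the way a Luxmark run does is something to verify on your own machine; it just reproduces the same kind of brief GPU load described above.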
 
I've got a Vega FE running in my Mac Pro now. ...

I was tempted to buy one of those when Newegg had them for $699. But it seems to have pretty awful reviews. I read it has poor Windows drivers and it runs hot. The TDP is also kind of high. The WX 9100 sounds better to me, but of course it costs more.
 
But it's an interesting card that sits between games and production: 16GB RAM and 10-bit out for monitors that support it. ...

Pretty much all modern Nvidia cards should have 10Bit out.
 

The so-called "10-bit" spec of consumer cards like GeForce and normal AMDs is generally limited to certain 3D-accelerated modes and/or full-screen applications.

Actual 10-bit out from the card is a proper 'pro' feature normally reserved for Quadros and FirePros.

I was tempted to buy one of those when Newegg had them for $699. But it seems to have pretty awful reviews. ...

The card looks and feels truly premium, even if this blower-style design doesn't look all that special in and of itself. The brushed aluminium, lit-up logo and "corner gem" are pretty nice.

It's a niche card, but remember what we're doing is very niche too! It's fine with me that Windows-based game reviews totally won't see the point of the card. Its performance is lower than a normal Vega 64's and it's more expensive. My card benches around 180000 in GeekBench 4 and I've seen normal Vegas hit 195000 or so.

I was a bit surprised by this since Vega 64 normally sits around 1247 MHz and the Frontier Edition is 1382 MHz with boost up to 1600 MHz.

Anyway, my hope was that this card might find its wings under High Sierra, since it's very similar to what's in the new iMac Pro with 16GB and 10-bit support; that its compatibility might actually be higher than a normal Vega's. But so far it's the same.

The 10-bit support is pretty significant to me since my next panels will be 10-bit and probably some flavour of HDR. But it's also possible that normal gfx cards will open up and allow for 10-bit when sending an HDR signal, like they do now for games.

The way forward? Ideally a new batch of 3rd-party RX Vega 64 cards will show up soon with the various brands' custom fans. It would be nice if some of them would stay in the $500s. Then I'd go with that for price/performance. I'm still hesitant about the 1080 Ti for various reasons.
 
Actual 10-bit out from the card is a proper 'pro' feature normally reserved for Quadros and FirePros.

...

The 10-bit support is pretty significant to me since my next panels will be 10-bit and probably some flavour of HDR. ...

http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus

Nvidia GPUs, including the GeForce line, have had a 10-bit signal since the 200 series?

I have a 10-bit monitor myself, and it worked in 10-bit color with every single Nvidia card we tested, besides the GT120 :)

See also :
https://forums.macrumors.com/thread...r-will-cmp-with-nvidia-drivers-do-it.1931788/

That's not to say it's not a nice card, but 10-bit out is just nothing new.

We sent back the batch of VEGA FE we bought at launch, because performance was disappointing. I hope things got better as they did for the regular VEGAs.
 
But are you aware that this link says exactly what I was saying?

No 10-bit support in PS. Only for full screen DirectX apps, i.e. games and such. They then refer users looking for 10-bit support in pro apps to the Quadro line.

I see the link I myself posted is not for macOS. Sorry about that.

Anyway, macOS has had 10-bit support, including 10-bit support in Photoshop, ever since Ps supported it.

In System Report you can see a test run with a Titan X (Maxwell) driving a 4K 10-bit screen.

And in the PS CC window, you can see the 30-bit option available and selected.

(See screenshot)
 

Alright. This sparked my interest. But I don't think it's quite as simple as you suggest.

The macOS 'System Report' looks promising. If you see 30-bit under a display there, it seems to me it should be active at the OS level in all apps. Doing some googling now, I think something happened around El Capitan and the launch of the 5K iMacs with 10-bit displays (which makes sense).

Apple does like to call their graphics cards "Radeon Pro", suggesting they have some of the feature set of the FirePro and aren't vanilla Radeons. And it might very well be that any modern Radeon card that is dropped into a Mac Pro enjoys the same OS level 10-bit support since the drivers are shared among DeviceIDs to a large extent.

With Nvidia, who creates their own web drivers, there is nothing that would "force" them to support 10-bit color on an OS level, but your screenshot certainly suggests this is the case. I'll go with that then—promising.

Regarding Windows: Photoshop introduced that preference setting for Windows around 2012. But early reports said that simply checking that option wasn't enough. You'd still need driver-level support to actually see 10-bit color (as tested by users with 10-bit grayscale ramps), i.e. you'd still need a pro card. So I'm not sure that box alone guarantees that it actually works. The situation might be different in 2017. I'll make a mental note, but it's Windows so it's not that interesting to me… yet.

I also have that option on my 8-bit MacBook Pro, by the way. It's there and it's checked, but I have an 8-bit display. But it might be that 10-bit is supported by my Radeon Pro 460 (for sure) and software (for sure), but that it simply fails at the last step: the display.
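
To make that grayscale-ramp test concrete, here is a small sketch, assuming Python with NumPy and Pillow (the dimensions and output filename are my own placeholders). It writes a 16-bit grayscale PNG stepping through 1024 levels: on a genuine 10-bit path the ramp should look smooth, while an 8-bit path shows visible banding. Whether the viewer you open it in actually pushes more than 8 bits to the display is, of course, exactly the question being debated here.

[CODE]
# Write a horizontal grayscale ramp with 1024 steps as a 16-bit PNG.
# On an 8-bit display path the steps show up as visible bands; on a 10-bit
# path the ramp should look smooth. Assumes numpy + Pillow; the filename
# and dimensions are arbitrary.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 4096, 512

# 1024 levels spread across the 16-bit range (4 pixels per level).
levels = (np.arange(WIDTH) // (WIDTH // 1024)) * (65535 // 1023)
ramp = np.tile(levels.astype(np.uint16), (HEIGHT, 1))

# "I;16" is Pillow's 16-bit little-endian grayscale mode; PNG keeps the depth.
img = Image.frombytes("I;16", (WIDTH, HEIGHT), ramp.tobytes())
img.save("ramp_10bit_test.png")
print("Wrote ramp_10bit_test.png - view it full screen and look for banding.")
[/CODE]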
 
I also have that option on my 8-bit MacBook Pro, by the way. It's there and it's checked, but I have an 8-bit display. ...

Your MacBook 15” 2016 has a 10Bit display (or probably 10Bit dithering) ;)


Edit - just FYI, there is debate about this, but gradients are displayed correctly, which doesn't leave much room for speculation that it's only 8Bit, regardless of the driver the LCD uses, and regardless of System Info, which shows 8Bit on the MacBook.

Apple is either applying dithering wisely, or, given the dGPU can do 10Bit out, the feature is only active when working with ProApps (although Preview also displays gradients correctly).

A theory which is actually verified by this screenshot! (it's a 2017 MacBook, but still valid)
 

Attachments

  • Screen Shot 2017-12-29 at 17.54.14.jpg
No, I'm afraid not.

Interesting that your MacBook Pro screenshot shows 30-bit display. Never seen that marketed.

Mine is 24 (8 bits per channel).

I know, mine shows 8Bit too... when the iGPU is in use!

My theory is that Apple switches on either a driver or something after the dGPU is used (e.g. when I opened Photoshop).

Try with yours
 

Depends on the screen. Polaris GPUs on MacBook Pros show 8 bit when using the laptop screen but 10 bit if using a professional or supported monitor.

Photoshop shows the 30bit option no matter what. It's a dithering technique for proofing managed by the Mercury Engine.
 

But then:

1 - why are 10bit gradients displayed correctly?

2 - why does System Report show the LCD is 10-bit driven after I opened Photoshop or FCPX?
It's the same regardless of what GPU I use (late 2016 maxed-out MacBook Pro 15").

Have you tried doing things in this order?

- Quit all apps and System Report

- Open Photoshop or other pro app

- Check 30bit option in Preferences/Performance/Advanced

- Open System Report and check

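
For what it's worth, that last step can also be checked from the command line instead of clicking through System Report. A small sketch, assuming Python on macOS (the exact wording of the depth field varies between macOS versions, so this just greps for anything depth-related):

[CODE]
# Dump the depth-related lines from the same data System Report shows under
# Graphics/Displays, e.g. "Framebuffer Depth: 30-Bit Colour (ARGB2101010)".
# Field names vary between macOS versions; treat this as a sketch.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in report.splitlines():
    if "depth" in line.lower():
        print(line.strip())
[/CODE]

Run it once with only the iGPU active and again after opening Photoshop or FCPX, and you can compare the reported depth without re-opening System Report each time.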
 