Latest beta, 10.13.3 I think it is.
LuxMark score was 26807 with just the Vega card.
What is your BIOS version? Is it this one? https://www.techpowerup.com/vgabios/194835/xfx-rxvega64-8176-170719
The VBIOS version should show up here.
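If it's easier than squinting at a screenshot, the same section of System Report can be dumped from Terminal. A minimal Python sketch, assuming the card reports its VBIOS under a "ROM Revision" field in the Graphics/Displays data (PC cards sometimes leave it blank, in which case GPU-Z under Windows or the TechPowerUp database is the more reliable source):

```python
import subprocess

# Dump the Graphics/Displays section of System Report and pull out the
# fields that identify the card and its VBIOS. Assumption: the VBIOS shows
# up as "ROM Revision"; on some PC cards this field is empty or unavailable.
report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in report.splitlines():
    if "Chipset Model" in line or "ROM Revision" in line:
        print(line.strip())
```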
Actually, I must admit I don't know exactly, but it looks like that is the one.
Couldn't really find it.
I'm debating getting a 1080 to replace my RX580... can I just plop in any 1080, and will there be issues such as sleep and fan speed?

I've got a Vega FE running in my Mac Pro now. Probably temporary. Booted no problem, but I've also got the fans running too high. The card isn't hot. Too bad... I found a used card and might keep it since it looks so sweet in my Mac, but I'll probably replace it with another 1080 Ti for more 'bang-for-buck'.
But it's an interesting card that sits between games and production: 16GB RAM and 10-bit out for monitors that support it. Needs proper drivers though.
My MSI 1080 Ti Gaming didn't have any fan issues at the time (Sierra). It was very quiet, which was pleasant.
But I do feel that AMD cards are generally a slightly safer bet in terms of support. It would be nice to see Nvidia jump on the eGPU wagon and really start to actively optimise their cards for High Sierra. I'm not ruling it out.
I have a 980 and I was planning on switching to Vega as soon as I could, but then all this mining business threw a wrench in that plan. That being said, I'm hoping AMD sorts out some of the glitchiness I've been dealing with because of Nvidia. I'm primarily doing video, and just want something that plays nice with FCP X and Resolve.
Also, as of 10.13.2 I've been having fan noise problems with my 980, it seems to be spinning at a base of 1170 RPM or higher no matter what, so maybe it's not strictly an AMD thing.
Pretty much all modern Nvidia cards should have 10Bit out
I was tempted to buy one of those when Newegg had them for $699. But it seems to have pretty awful reviews. I read it has poor Windows drivers and it runs hot. The TDP is also kind of high. The WX 9100 sounds better to me, but of course it costs more.
The so-called "10-bit" spec of consumer cards like GeForce and normal AMDs is generally limited to certain 3D-accelerated modes and/or full-screen applications.
Actual 10-bit out from the card is a proper 'pro' feature normally reserved for Quadros and FirePros.
...
The 10-bit support is pretty significant to me since my next panels will be 10-bit and probably some flavour of HDR. But it's also possible that normal gfx cards will open up and allow for 10-bit when sending an HDR signal, like they do now for games.
http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus
Nvidia GPUs, including the GeForce line, have had 10-bit signal output since the 200 series?
But are you aware that this link says exactly what I was saying?
No 10-bit support in PS. Only for full screen DirectX apps, i.e. games and such. They then refer users looking for 10-bit support in pro apps to the Quadro line.
Anyway, macOS has had 10-bit support, including 10-bit support in Photoshop, ever since Ps added it.
Alright. This sparked my interest. But I don't think it's quite as simple as you suggest.
The macOS 'System Report' looks promising. If you see 30-bit under display there, it seems to me it should be active on an OS level in all apps. Doing some googling now, I think something happened around El Capitan and the launch of the 5K iMacs with 10-bit displays (which makes sense).
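(If anyone wants to script that check instead of opening System Report, here's a rough Python sketch. I'm assuming a 10-bit path shows up as "30-Bit Color (ARGB2101010)" under a "Pixel Depth" or "Framebuffer Depth" field; the exact label seems to vary with the macOS version, so treat it as a convenience rather than ground truth.)

```python
import re
import subprocess

# Ask system_profiler for the Graphics/Displays report and classify each
# attached display by the depth string it advertises. Assumption: a 10-bit
# path is reported as "30-Bit Color (ARGB2101010)".
out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for depth in re.findall(r"(?:Pixel|Framebuffer) Depth:\s*(.+)", out):
    verdict = "looks 10-bit" if "30-Bit" in depth else "looks 8-bit"
    print(f"{depth.strip():40s} -> {verdict}")
```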
Apple does like to call their graphics cards "Radeon Pro", suggesting they have some of the feature set of the FirePro and aren't vanilla Radeons. And it might very well be that any modern Radeon card that is dropped into a Mac Pro enjoys the same OS level 10-bit support since the drivers are shared among DeviceIDs to a large extent.
With Nvidia, who creates their own web drivers, there is nothing that would "force" them to support 10-bit color on an OS level, but your screenshot certainly suggests this is the case. I'll go with that then—promising.
Regarding Windows: Photoshop introduced that preference setting for Windows around 2012. But early reports said that simply checking that option wasn't enough. You'd still need driver-level support to actually see 10-bit color (as tested by users with 10-bit grayscale ramps, like the one sketched at the end of this post), i.e. you'd still need a pro card. So I'm not sure that box alone guarantees that it actually works. The situation might be different in 2017. I'll make a mental note, but it's Windows so it's not that interesting to me… yet.
I also have that option on my MacBook Pro, by the way. It's there and it's checked, but the display is 8-bit. It might be that 10-bit is supported by my Radeon Pro 460 (for sure) and the software (for sure), but that it simply fails at the last step: the display.
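Since grayscale ramps came up: here's a quick way to roll your own test image. A minimal sketch assuming numpy and Pillow are installed; it writes a 16-bit TIFF containing 1024 grey levels, so any banding you see comes from the pipeline (GPU, driver, OS, app), not the file.

```python
import numpy as np
from PIL import Image

# Horizontal grayscale ramp with 1024 distinct levels, saved as a 16-bit TIFF
# so the extra precision survives on disk. Viewed full screen: a genuine
# 10-bit path renders it smoothly, while an 8-bit path collapses it into
# roughly 256 visible bands (unless the app dithers).
WIDTH, HEIGHT = 4096, 512
levels = np.linspace(0, 1023, WIDTH).astype(np.uint16)  # 10-bit code values
ramp = np.tile(levels * 64, (HEIGHT, 1))                 # rescale to 16-bit range

Image.fromarray(ramp, mode="I;16").save("ramp_10bit_test.tif")
print("wrote ramp_10bit_test.tif")
```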
...
Apple is either applying dithering wisely, or, given that the dGPU can do 10-bit out, the feature is only active when working with pro apps (although Preview also displays gradients correctly).
Your 2016 15" MacBook Pro has a 10-bit display (or probably 10-bit dithering).
No, I'm afraid not.
Interesting that your MacBook Pro screenshot shows 30-bit display. Never seen that marketed.
Mine is 24 (8 bits per channel).
I know, mine shows 8-bit too... when the iGPU is in use!
My theory is that Apple switches on a driver or something once the dGPU is used (e.g. when I opened Photoshop).
Try with yours.
Depends on the screen. Polaris GPUs on MacBook Pros show 8 bit when using the laptop screen but 10 bit if using a professional or supported monitor.
Photoshop shows the 30bit option no matter what. It's a dithering technique for proofing managed by the Mercury Engine.
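To illustrate what that sort of dithering buys you (a toy numpy sketch of the general idea, not Adobe's actual Mercury Engine): quantising a 10-bit ramp straight down to 8 bits leaves hard steps, while adding a little noise before the rounding lets the average of nearby pixels recover the 10-bit value, so smooth gradients stop banding on an 8-bit pipe.

```python
import numpy as np

# Toy model of dithered proofing (the general idea, not Adobe's Mercury
# Engine). Each 10-bit grey level is rendered as a patch of 8-bit pixels;
# with noise added before quantisation, the patch average approximates the
# original 10-bit value instead of snapping to the nearest 8-bit step.
rng = np.random.default_rng(0)
values10 = np.arange(1024)                        # every 10-bit grey level
patches = np.repeat(values10[:, None], 256, 1)    # 256 pixels per level

plain = np.clip(np.round(patches / 4.0), 0, 255)  # straight 10 -> 8 bit
noise = rng.uniform(-2.0, 2.0, patches.shape)     # +/- half an 8-bit step
dithered = np.clip(np.round((patches + noise) / 4.0), 0, 255)

err_plain = np.abs(plain.mean(axis=1) * 4 - values10).mean()
err_dither = np.abs(dithered.mean(axis=1) * 4 - values10).mean()
print(f"mean error, straight quantise: {err_plain:.2f} (of 1024 levels)")
print(f"mean error, dithered:          {err_dither:.2f} (of 1024 levels)")
```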
It's the same regardless of what GPU I use (late 2016 maxed-out 15" MacBook Pro).