I've been using VegaTab_64.kext for a while now!! :D

"Fan Speed(%)"=71
"Fan Speed(RPM)"=1775
"Temperature(C)"=32
:apple:
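For anyone curious where those numbers live: they look like the AMD driver's PerformanceStatistics entries in the I/O Registry. A minimal sketch for reading them, assuming those key names (they vary by driver version):

```python
import re
import subprocess

# Dump the I/O Registry; the AMD driver publishes a "PerformanceStatistics"
# dictionary with entries like "Fan Speed(%)" and "Temperature(C)".
ioreg = subprocess.run(["ioreg", "-l", "-w0"],
                       capture_output=True, text=True).stdout

for line in ioreg.splitlines():
    if "PerformanceStatistics" not in line:
        continue
    # Key names are an assumption -- they vary by driver version.
    for key in ("Fan Speed(%)", "Fan Speed(RPM)", "Temperature(C)"):
        m = re.search('"%s"=(\\d+)' % re.escape(key), line)
        if m:
            print('"%s" = %s' % (key, m.group(1)))
```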
 

Attachments

  • Captura de Tela 2018-09-24 às 23.03.30.png (900.4 KB)
  • VegaTab_64.kext.zip (3.7 KB)
Can confirm the Vega Frontier Edition works fine, except for the fan issue, which goes away after playing an HEVC video.

Vega 64 LC works fine too, but also has a fan issue: fan speed drops under load and only rises (slightly) when the target temperature is reached, never hitting the target speed or the maximum speed. When the load is removed, the fan speeds up to cool the GPU, but again doesn't reach the target speed. A bug report is open with Apple; I provided another sysdiagnose this morning.
 
TBH, no idea. My copy of GPUTest was downloaded a few years ago.
Anyway, running the GPUTest_GUI inside the package should give you a decent GUI for running FurMark.

Alright, found it. Bit of a hassle to get it running; it only runs via the command line for me.
Scores are a bit wonky though: I'm faster than a CrossFire Vega 64 according to FurMark, but my GPU frequency is around 1400 MHz (I've clocked mine to go up to 1800 MHz).
Screenshot 2018-09-26 at 12.02.48.png
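For anyone else stuck at the same point, this is roughly how I drive it from the command line. The bundle path and the flags are taken from the GpuTest readme as I remember it, so treat them as assumptions for your version:

```python
import subprocess

# Adjust to wherever you unpacked the app bundle.
GPUTEST = "GpuTest.app/Contents/MacOS/GpuTest"

# Flags as documented in the GpuTest readme (assumptions -- check your version):
# /test=fur selects the FurMark-style stress test, /benchmark runs a timed pass.
subprocess.run([
    GPUTEST,
    "/test=fur",
    "/width=1920", "/height=1080",
    "/benchmark", "/benchmark_duration_ms=60000",
])
```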
 
Hi,

I would like to know whether Vega or Polaris cards are true 10-bit per color (not 8-bit + 2-bit FRC) at 4K resolution on High Sierra or Mojave (I need true 10-bit for my work).

Thanks in advance
 
There's a new Pro Vega 64 GPU listed in the driver for 10.14.1 with VBIOS version 113-D05001A1XTOD. That's similar to the Pro Vega 64 in the iMac Pro, which has VBIOS version 113-D05001A1XT-017. It uses a new framebuffer called Sleipnir, which has 4 DisplayPort and 1 HDMI ports. It appears to be an eGPU, most likely similar to the Blackmagic Pro 580.

Sleipnir.png Sleipnir 2.png Sleipnir 3.png
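If you want to poke at the framebuffer personalities yourself, they sit in the controller kext's Info.plist. A sketch, assuming the Vega personalities live in AMD10000Controller.kext (the kext name is an assumption and varies by GPU family and OS release):

```python
import plistlib

# Path is an assumption -- adjust for your macOS version / GPU family.
KEXT_PLIST = ("/System/Library/Extensions/"
              "AMD10000Controller.kext/Contents/Info.plist")

with open(KEXT_PLIST, "rb") as f:
    info = plistlib.load(f)

# Each IOKit personality names a framebuffer (Sleipnir, etc.) and lists
# the PCI device IDs it matches.
for name, personality in info.get("IOKitPersonalities", {}).items():
    print(name, personality.get("IOPCIMatch", ""))
```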
 
It is showing as true 10-bit on my Mojave install using the RX Vega 56 connected to my 10-bit monitor at 4K.
 
How can you be sure the Vega 56 is true 10-bit?

On AMD's site I don't see any reference to 10-bit for the Vega 56 or Vega 64, whereas for the Frontier Edition it's clearly indicated.
 
All Radeons output 10-bit on macOS and Windows to 10-bit-capable monitors.

GeForce only outputs 8-bit on both OSes.

Go to the System Information app and choose "Graphics/Displays"; you will see the color depth ARGB2101010.
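Same check from the command line, if you prefer (System Information is backed by system_profiler; the exact label varies a bit between macOS versions):

```python
import subprocess

# "Graphics/Displays" in System Information is backed by system_profiler.
out = subprocess.run(["system_profiler", "SPDisplaysDataType"],
                     capture_output=True, text=True).stdout

# On a 10-bit pipe the depth line shows 30-Bit Color (ARGB2101010).
for line in out.splitlines():
    if "Depth" in line:
        print(line.strip())
```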
 
Yes, but if the GPU and/or monitor were 8-bit + 2-bit FRC, you would still see ARGB2101010: am I wrong?
 
Yes, you still get 10-bit (with dithering).

OK, but my problem is: I have a 10-bit monitor (real 10-bit, not 8-bit + 2-bit dithering) and I would like to use it at true 10-bit.

Only a few monitors (so far) are true 10-bit; most are 8 + 2 FRC "10-bit". Obviously, true 10-bit monitors are more expensive, but the difference is clear.

So, I need to be sure to choose an appropriate graphics card.
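Just to put numbers on why the distinction matters: 10-bit has four times the tonal steps per channel, and an 8+2 FRC panel fakes the extra steps by flickering between adjacent 8-bit levels:

```python
levels_8 = 2 ** 8     # 256 shades per channel
levels_10 = 2 ** 10   # 1024 shades per channel

print(levels_8 ** 3)   # 16,777,216 colors (8-bit RGB)
print(levels_10 ** 3)  # 1,073,741,824 colors (10-bit RGB)
```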
 
Vega 56 is a safe choice if you are using a classic Mac Pro (cMP). Prices are good now too.
 
I have a true 10-bit monitor. When I was on 10.13.6, my Vega 56 was showing 8-bit output to it, but on 10.14 it now shows 10-bit.
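An easy end-to-end sanity check is a shallow gradient: on a true 10-bit path it looks smooth, on 8-bit you see bands roughly four columns wide. A sketch, assuming a PNG writer that handles 16-bit data (imageio here) and a viewer that doesn't dither it back down:

```python
import numpy as np
import imageio

# One 10-bit code value per column, scaled into the 16-bit PNG range.
ramp = np.arange(1024, dtype=np.uint16) * 64
img = np.tile(ramp, (256, 1))  # 256 rows x 1024 columns, grayscale

imageio.imwrite("gradient16.png", img)
```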
 
@walexago BenQ make some "cheap" 10-bit displays, at least comparatively cheap.

The BenQ SW240 got a really good review (10-bit Adobe RGB, and it even has a 14-bit look-up table):
https://translate.google.com/transl...stiger-grafik-monitor-ueberzeugt/&prev=search

The BenQ PD2700Q is only sRGB but cheaper:
https://translate.google.com/transl...richte/test-monitor-benq-pd2700q/&prev=search
Still looks very good.

www.prad.de is the best display site I know, with really amazing reviews. New reviews are not in English, but Google can fix that ;)

Even the RX 560/580 have 10-bit out. The RX 560 needs no extra power, so if you have a spare PCIe slot you can just use it to drive the display when the GPU's compute power is not needed.

Also, if you're doing video work, Blackmagic make some PCIe cards with 10-bit video out:
https://www.blackmagicdesign.com/uk/products/intensitypro4k
 
I use the UP2718Q: true 10-bit, 4K and HDR. It's expensive (I use it mainly for my work) but it's a wonderful monitor.

Then you're set with an AMD RX 580 or Vega card. With 10.14 they output 10-bit natively, so as long as you are using a true 10-bit monitor, you'll be fine.

I'm using an even more expensive LG 31MU97-B 4K DCI monitor and it's showing 10-bit in System Information. As I do video, I also have a true 10-bit signal coming out of a Kona 4 PCIe I/O card going into a Flanders Scientific monitor. Now that costs a ton more money... certainly more than I'd like it to be :(
 