I ran the Nvidia firmware updater and it did indeed find that it needed to update, which I let it do. There didn't seem to be any change in Windows afterwards; the link rate is still capped at 5.4 Gbps.

I'll see how viable it would be to turn this PC into a Hackintosh so I can run those AGDCDiagnose tools.

I have started gaming on the Pro Display XDR in 4K 10-bit and the colors are deliciously rich, even in the 10-year-old game engine of the MMO I play. It's a very immersive monitor, even for a task outside its design spec.

If you think this crotchety 1080 Ti setup would be useful for the custom EDID, feel free.
If you try the Hackintosh route, note that the latest macOS version to support the Nvidia 1080 Ti is High Sierra. I have a couple of Hackintoshes with Nvidia cards (Maxwell Titan X and Pascal 1070). Sadly, no macOS version supports RTX.

Some games will work fine at 6K with lowered settings.

What format do you need the EDID in? You said it was a RegEdit thing so I guess a sequence of hex digits will do. I'll work on it tomorrow.

With the link rate capped at 5.4 Gbps, to get 6K 60Hz, you'll need YCbCr422 6bpc or YCbCr420 8bpc.
You will either get a proper image, strange colors, or no image at all.
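Rough math behind that (a sketch; the ~1286 MHz pixel clock for 6016x3384@60 is my reduced-blanking estimate, not necessarily the XDR's exact timing):
Code:
# HBR2 payload ~ 5.4 Gbps x 4 lanes x 0.8 (8b/10b) ~ 17.28 Gbps
# 6K60 RGB 8bpc      ~ 1286 MHz x 24 bits/pixel ~ 30.9 Gbps -> too much
# 6K60 YCbCr422 6bpc ~ 1286 MHz x 12 bits/pixel ~ 15.4 Gbps -> fits
# 6K60 YCbCr420 8bpc ~ 1286 MHz x 12 bits/pixel ~ 15.4 Gbps -> fits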
 
The hex for the modified EDID is at the top of chroma_test_modified_edid-decode.txt.

I used my EDIDUtil.sh script to make some of the changes. First, you load the EDID with one of loadagdcfile, loadswitchresxfile, loadstring, or whatever. List the EDIDs with listedids. Choose one to edit using useedidnum. Make the changes, then dump out the result. A variable named theedid contains the result. Use decode to decode it fully (requires edid-decode).

The commands I used were these:
Code:
source EDIDUtil.sh # loads the commands into memory
loadagdcfile AGDCDiagnose.txt # use loadagdc instead if you have the display connected
listedids
useedidnum 1 # change this to the number that you actually want to edit
clearserialnumber # clear the serial number field in the EDID
dumpedid > chroma_test_original_dumpedid.txt
decode > chroma_test_original_edid-decode.txt
oldbyte=$((0x${theedid:0x14*2:2})); replacebytes 0x14 $(printf "%02x\n" $(((oldbyte & ~0x70) | (((12-4)/2) << 4) ))) # set bpc to 12 (EDID byte 0x14 bits 6-4 hold the color bit depth, encoded as (bpc-4)/2)
addchromasubsampling
adddisplayidblock -1 2600090f0f070700007f0000 # append a raw DisplayID data block (payload given as hex bytes)
dumpedid > chroma_test_modified_dumpedid.txt
decode > chroma_test_modified_edid-decode.txt

# use BBEdit to compare before and after
bbdiff chroma_test_original_dumpedid.txt chroma_test_modified_dumpedid.txt
bbdiff chroma_test_original_edid-decode.txt chroma_test_modified_edid-decode.txt

Things to try (assuming an HBR2 connection; rough bandwidth math in the sketch after this list):
  • RGB, YCbCr444 (YCbCr444 is similar enough to RGB that the image should only have incorrect colors if YCbCr is not supported).
    • 2560x1440 (up to 12 bpc)
    • 4K (up to 10 bpc)
    • 5K (up to 6 bpc)
    • 6K (no 6K)
  • YCbCr422
    • 4K (up to 12 bpc)
    • 5K (up to 8 bpc)
    • 6K (there is no 6 bpc option for chroma subsampling modes, so 6K isn't possible?)
  • YCbCr420
    • 5K (up to 12 bpc)
    • 6K (8 bpc)
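The list above comes from this kind of quick estimate. This is a bash sketch, not the XDR's exact timings: fits_hbr2 is just a helper name I made up, the pixel clocks (241/533/938/1286 MHz) are rough CVT-RB-style guesses, real modes only come in 6/8/10/12 bpc, and the chroma-subsampled formats have no 6 bpc option, so round the printed values down accordingly.
Code:
#!/bin/bash
# Which bit depths fit a 4-lane HBR2 link at 60 Hz? (rough estimate)
# HBR2 payload = 5.4 Gbps x 4 lanes x 0.8 (8b/10b) = 17.28 Gbps = 17280 Mbit/s

fits_hbr2() { # fits_hbr2 <label> <approx pixel clock in MHz>
  awk -v label="$1" -v pclk="$2" 'BEGIN {
    bpp = 17280 / pclk   # max bits per pixel the link can carry at this pixel clock
    # samples per pixel: RGB/4:4:4 = 3, YCbCr 4:2:2 = 2, YCbCr 4:2:0 = 1.5
    printf "%-14s RGB/444 <= %2d bpc, 422 <= %2d bpc, 420 <= %2d bpc\n", label, int(bpp/3), int(bpp/2), int(bpp/1.5)
  }'
}

fits_hbr2 "2560x1440@60"  241
fits_hbr2 "3840x2160@60"  533
fits_hbr2 "5120x2880@60"  938
fits_hbr2 "6016x3384@60" 1286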
 

Attachments

  • chroma_test.zip
    9.4 KB
I feel like a caveman throwing sticks at a monolith! You remember I mentioned how I got the official EDID in the first place - by copying it from the EDID entry in the registry when it was correctly supplied after using integrated graphics?

Well for the life of me I couldn't work out how to copy and paste hex codes from the custom EDID .txt into the registry editor, so I ended up manually typing in the hex codes!

Anyway, here is the custom EDID as it appears to Monitor Asset Manager.

[Screenshot: Monitor Asset Manager showing the custom EDID]


I couldn't get any of the chroma modes to appear in the Nvidia control panel. Expected, or have I screwed something up?

[Screenshot: Nvidia Control Panel showing the available modes]
 
I feel like a caveman throwing sticks at a monolith! You remember I mentioned how I got the official EDID in the first place - by copying it from the EDID entry in the registry when it was correctly supplied after using integrated graphics?

Well for the life of me I couldn't work out how to copy and paste hex codes from the custom EDID .txt into the registry editor, so I ended up manually typing in the hex codes!
The hex codes are simple text. You could paste them into a text editor and do a search and replace to convert them to the correct format if they're not already in it.
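For example, on the macOS side something like this gets you from the hex text to a binary and to comma-separated bytes (a sketch; the file names are placeholders and I'm assuming chroma_test_modified_dumpedid.txt contains nothing but the hex byte pairs):
Code:
# hex text -> raw binary (handy for CRU import or edid-decode)
tr -d ' \t\n' < chroma_test_modified_dumpedid.txt | xxd -r -p > modified_edid.bin

# raw binary -> comma-separated hex bytes (easier to re-enter elsewhere)
xxd -p modified_edid.bin | tr -d '\n' | sed 's/../0x&,/g; s/,$//'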

Anyway, here is the custom EDID as it appears to Monitor Asset Manager.

I couldn't get any of the chroma modes to appear in the Nvidia control panel. Expected, or have I screwed something up?
The file chroma_test_modified_edid-decode.txt has five EDID blocks, each 128 bytes, for 640 bytes total. Your raw data is showing only 256 bytes and only the first 128 of them are filled in. I think moninfo.exe won't understand all the blocks, but it should be able to show the EDID bytes. I'll do some tests in Windows.

edited: corrected count of EDID blocks and total bytes.
 
Well for the life of me I couldn't work out how to copy and paste hex codes from the custom EDID .txt into the registry editor, so I ended up manually typing in the hex codes!
Yes, Registry Editor is horrible. I thought there was copy and paste, but there isn't (not for the EDID binary data anyway).

I tried using a modified INF file (created by Monitor Asset Manager moninfo.exe and edited with my EDID changes) as described at https://docs.microsoft.com/en-us/windows-hardware/drivers/display/overriding-monitor-edids but I got the error you ran into (driver not signed).

So I tried some methods to get around that error:


And some methods to install the inf:

But the inf installation didn't seem to affect anything (or I was doing it wrong or something was missing).

The best way to install an EDID override is Custom Resolution Utility (CRU):
It's also a pretty good EDID editor (it doesn't know all the blocks, but at least it won't modify them unless you want to delete them). The only problem I see with it is that it only supports up to 3 extension blocks (4 blocks total). The Apple Pro Display XDR has 4 extension blocks (5 blocks total).
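If you want to double-check how many blocks an EDID .bin claims, byte 126 of the base block is the extension count (edid.bin here is just a placeholder name):
Code:
# byte 126 (0x7E) of the base EDID block = number of extension blocks
printf 'extension blocks: %d\n' 0x$(xxd -p -s 126 -l 1 edid.bin)
# total size should be (extensions + 1) * 128 bytes
wc -c edid.bin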

To use it:
  1. Make changes to the EDID, or import from .bin (attached below).
  2. Press OK to save the EDID to the registry (or press Cancel to not save changes).
  3. Run restart64.exe to use the EDID without restarting the computer.
I applied the modified XDR EDID to my Dell display. I know it works because the name of the display was changed to "Pro Display" like it is in the modified XDR EDID. Nvidia control panel shows YCbCr422 but not YCbCr420. I guess I still need to figure that out (does Nvidia support 4:2:0? does it support that on DisplayPort?)

Anyway, here is the custom EDID as it appears to Monitor Asset Manager.
Monitor Asset Manager moninfo.exe doesn't seem to always be able to show the other extension blocks even though CRU is able to edit all the extension blocks. I guess I would trust CRU more. It has an option to export the EDID as bin so it can be easily decoded by edid-decode on macOS.
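So a quick way to verify what CRU actually wrote is to export the .bin in CRU and look at it on the Mac side (sketch; exported-from-cru.bin is a placeholder name):
Code:
edid-decode exported-from-cru.bin   # decodes the base block and all extension blocks
xxd exported-from-cru.bin           # or just eyeball the raw bytes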
 
Okay, I think I got your modified EDID installed correctly this time, using Custom Resolution Utility.

It exposed a lot more modes in the Nvidia Control Panel.

[Screenshot: Nvidia Control Panel resolution list after applying the modified EDID]


Unfortunately, just about all of the new modes resulted in either a black screen or a green & purple display in the case of YCbCr444 (which as you said is close enough to RGB to still show something).

Does this conclusively prove that the chroma modes are not supported by the Pro Display XDR?
 
I applied the modified XDR EDID to my Dell display. I know it works because the name of the display was changed to "Pro Display" like it is in the modified XDR EDID. Nvidia control panel shows YCbCr422 but not YCbCr420. I guess I still need to figure that out (does Nvidia support 4:2:0? does it support that on DisplayPort?)
I forgot that DisplayPort 1.2 displays (my Dell UP2715K, P2415Q, and P2715Q) and DisplayPort 1.2 GPUs (Maxwell Titan X) do not support 4:2:0 chroma subsampling except on their HDMI 2.0 ports.

I redid my test with an Acer XV273K (4K DisplayPort 1.4) and Nvidia GTX 1070 (supports DisplayPort 1.4). With the patched EDID, it added many options: 5K, 6K resolutions; RGB, YCbCr422, YCbCr420 color formats; 8, 10, 12 bit depths. GPU-Z shows HBR2 link rate when using 6K YCbCr420 8 bpc and HBR3 link rate when using 6K YCbCr420 12 bpc, as expected. The Acer is a 4K display but its onscreen menu shows that it's receiving 5120x2880 or 6016x3384 and shows an image that is flashing and very squished but has the correct colours and is almost usable (usable enough that I could click on stuff).
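The GPU-Z link rates line up with the rough numbers (same ~1286 MHz pixel-clock estimate as before):
Code:
# 6K60 YCbCr420  8bpc ~ 1286 MHz x 12 bits/pixel ~ 15.4 Gbps -> fits HBR2 (~17.3 Gbps payload)
# 6K60 YCbCr420 12bpc ~ 1286 MHz x 18 bits/pixel ~ 23.1 Gbps -> needs HBR3 (~25.9 Gbps payload)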

One interesting thing about the XDR 4K timings is that they are not compatible with my Dell 4K displays, which expect the normal 4K timings (either the CVT-RB timing or the HDMI 2.0 timing).

Unfortunately, just about all of the new modes resulted in either a black screen or a green & purple display in the case of YCbCr444 (which as you said is close enough to RGB to still show something).

Does this conclusively prove that the chroma modes are not supported by the Pro Display XDR?
Thank you very much. It seems chroma subsampling is not an option for the XDR (unless someone makes a chroma subsampling to DSC converter - very unlikely).

One mystery remains. Your single cable connection was limited to HBR2 speed? You were unable to get 5K60 8bpc RGB or 4:4:4 (should be doable with HBR3)? Or 4K60 RGB/4:4:4 12 bpc? The DPCD dump included in the output of AGDCDiagnose in macOS will tell us how the display is advertising itself (I recently received AGDCDiagnose output for dual HBR3 mode over Thunderbolt 3, so I think I just need info for single cable non-DSC mode). If it advertises HBR2 speed even for a DisplayPort 1.4 connection, then a DisplayPort override (instead of an EDID override) might be interesting.

When you're done experimenting with CRU, you can run the reset-all.exe program to undo the registry changes. Then restart64.exe to reload (or restart the computer).
 
One thing we didn't try was lower refresh rates.
With the HBR2 link rate limit and no DSC, you can try (rough math in the sketch below):
5K60 6bpc (you've already seen that).
5K46 8bpc
6K30 8bpc
6K44 6bpc

8bpc = 16 million colors.
6bpc = 262 thousand colors.
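Those refresh rates come from the same kind of link budget. This is a bash sketch using CVT-RB2-style blanking assumptions (80-pixel horizontal blank, 460 us vertical blank); max_refresh is just a helper I made up, and it lands slightly higher than the conservative numbers above:
Code:
#!/bin/bash
# Rough maximum refresh rate on a 4-lane HBR2 link (payload ~ 17.28 Gbps), no DSC.
max_refresh() { # max_refresh <label> <hactive> <vactive> <bits per pixel>
  awk -v label="$1" -v h="$2" -v v="$3" -v bpp="$4" 'BEGIN {
    cap     = 17.28e9                    # HBR2 payload in bit/s
    pixrate = cap / bpp                  # pixels per second the link can carry
    htotal  = h + 80                     # CVT-RB2-style horizontal blanking (assumption)
    vblank  = 460e-6 * pixrate / htotal  # lines spent in a 460 us vertical blank (assumption)
    printf "%-22s max ~ %2.0f Hz\n", label, pixrate / (htotal * (v + vblank))
  }'
}
max_refresh "5K RGB/444 6bpc" 5120 2880 18   # ~ 62 Hz
max_refresh "5K RGB/444 8bpc" 5120 2880 24   # ~ 47 Hz
max_refresh "6K RGB/444 8bpc" 6016 3384 24   # ~ 34 Hz
max_refresh "6K RGB/444 6bpc" 6016 3384 18   # ~ 46 Hz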
 
Installing Boot Camp drivers might enable 6K (if you have DSC) and enable brightness control and presets without requiring an EDID override:
#7
#108
 
Installing Boot Camp drivers might enable 6K (if you have DSC) and enable brightness control and presets without requiring an EDID override:
#7
#108
I installed the Boot Camp package and it worked at 6K, with the ability to adjust the brightness. See my post; the package download link is in #7 as joevt posted.
 
I have a Corsair One desktop PC with a 2080 Ti and cannot install the Boot Camp drivers. So I have no idea how that laptop was able to natively install them? Also, I bought the Moshi DisplayPort to USB-C cable based on someone's experience in this thread. And while it connects, I get tons of artifacts and frequent black screens. I might just try to get a new cable and see if it makes a difference, but using a PC on this beautiful screen is starting to look pretty bleak 😢
 
I have a Corsair One desktop PC with a 2080 Ti and cannot install the Boot Camp drivers. So I have no idea how that laptop was able to natively install them? Also, I bought the Moshi DisplayPort to USB-C cable based on someone's experience in this thread. And while it connects, I get tons of artifacts and frequent black screens. I might just try to get a new cable and see if it makes a difference, but using a PC on this beautiful screen is starting to look pretty bleak 😢
Did you try the Boot Camp drivers from the post I linked? (I have not tried them myself)
latest BootCamp software

Did you check the EDID? Is the EDID valid? If not, did you try overriding the EDID using CRU?
 
The EDID is correct. I tried the drivers and get the same "This version of Boot Camp is not intended for this computer model" error. I ordered a new Moshi cable, coming tomorrow. Hopefully the issues I've been having are just QC.
 
The EDID is correct. I tried the drivers and get the same "This version of Boot Camp is not intended for this computer model" error. I ordered a new Moshi cable, coming tomorrow. Hopefully the issues I've been having are just QC.
Dear ksc323,

I tried to repeat the installation process and I found out that there is a little detail that I had forgotten to mention:

Don't double-click the "Setup.exe" file in /BootCamp6.1.7748/BootCamp/; instead, run the "BootCamp.msi" file in /BootCamp6.1.7748/BootCamp/Drivers/Apple/. This should work.

Here I'm attaching two screenshots as proof (I also changed the display language to English for convenience). As you can see, unlike on a real Mac, only two tabs are displayed in the Boot Camp Control Panel: "Startup Disk" and "Display". In the second tab you can adjust the brightness and change the presets.

Please let me know if this works. I really hope that I'm not the only lucky one. -Henry

[Screenshot: Boot Camp Control Panel - Startup Disk tab]

[Screenshot: Boot Camp Control Panel - Display tab with brightness and preset controls]
 
I can confirm that the Belkin 3.1 USB-C to USB-C Cable (100W) works with the 2080 Ti VirtualLink port.
It works at 6K, 4:4:4, RGB, and up to 12-bit depth.
The display's internal USB hub also works, at USB 2.0 speeds.
Brightness control and profile selection also function in Windows 10 with the Boot Camp drivers over this cable.

This cable is too short for me: 1m.
Please list longer USB-C cables with the same properties.

The box features the following marks:
"4K"
10Gbps
SuperSpeed+
 
I can confirm that the Belkin 3.1 USB-C to USB-C Cable (100W) works with the 2080 Ti VirtualLink port.
It works at 6K, 4:4:4, RGB, and up to 12-bit depth.
The display's internal USB hub also works, at USB 2.0 speeds.
Brightness control and profile selection also function in Windows 10 with the Boot Camp drivers over this cable.

This cable is too short for me: 1m.
Please list longer USB-C cables with the same properties.

The box features the following marks:
"4K"
10Gbps
SuperSpeed+

Does the 2m Thunderbolt cable that comes with the XDR not work?
 
Does the 2m Thunderbolt cable that comes with the XDR not work?
No, it doesn't work. Cables that DO NOT WORK with the 2080 Ti VirtualLink port so far:
  • Original Apple Pro Display XDR cable (Thunderbolt 3)
  • MacBook Pro 16" (2019) cable
  • Satechi USB-C to USB-C 100W 2m cable
 
I'm curious: in theory, is there any difference between the Belkin 3.1 USB-C to USB-C Cable and the other three cables mentioned?
  • Active Thunderbolt cables are not compatible with SuperSpeed USB/DisplayPort use. Someone else said the XDR cable does work from an AMD W5700 USB-C port but I'm not 100% convinced #12 ...
  • I don't know what a MacBook Pro 16" (2019) cable is. Is it only for charging?
  • The Satechi cable is only for charging (it supports only USB 2.0).
You need a cable that supports USB 3.1 Gen 2 or DisplayPort 1.4.
 
I have a Corsair One desktop PC with a 2080 Ti and cannot install the Boot Camp drivers. So I have no idea how that laptop was able to natively install them? Also, I bought the Moshi DisplayPort to USB-C cable based on someone's experience in this thread. And while it connects, I get tons of artifacts and frequent black screens. I might just try to get a new cable and see if it makes a difference, but using a PC on this beautiful screen is starting to look pretty bleak 😢

Hi ksc323! I am Daniel and I work for Moshi. It is always great to see honest feedback; may I ask if you have already contacted our Customer Service team at support@moshi.com?
We will be happy to assist you with this situation and, if necessary, provide you with a replacement.

Looking forward to hearing from you soon!
Cheers
 
  • Active Thunderbolt cables are not compatible with SuperSpeed USB/DisplayPort use. Someone else said the XDR cable does work from an AMD W5700 USB-C port but I'm not 100% convinced #12 ...
  • I don't know what a MacBook Pro 16" (2019) cable is. Is it only for charging?
  • The Satechi cable is only for charging (it supports only USB 2.0).
You need a cable that supports USB 3.1 Gen 2 or DisplayPort 1.4.

Dear joevt, may I ask several dummy questions?

1. DSC is a means of compression; will it affect the quality of the image? If so, in what way? I guess the DSC engineers should try to make such influence as invisible as possible, right?

2. How can Thunderbolt 3 carry two HBR3 channels in one physical cable? I know in the old days the Dell 5K monitor always required two cables. Is this also achieved with DSC technology, or some other technology?

3. The Pro Display XDR always uses a single Thunderbolt 3 cable (only one port) for the image, even with the latest graphics card in the Mac Pro. I wonder if the XDR is always using DSC? If there is some image sacrifice, why doesn't Apple offer a way to use full raw data transmission, without any compression, from the graphics card to the monitor (aka double cables)? IMO that's the way to cherish the superb panel of the XDR; at least I would love that option, even if it's just a psychological level of feeling.

Apologies if these questions are too superficial. I just want to know what the hell happened here.

Thanks.
 
1. DSC is a means of compression; will it affect the quality of the image? If so, in what way? I guess the DSC engineers should try to make such influence as invisible as possible, right?
DSC is visually lossless. It's a lot more complicated than YCbCr 4:2:0 and has better compression.

2. How can Thunderbolt 3 carry two HBR3 channels in one physical cable? I know in the old days the Dell 5K monitor always required two cables. Is this also achieved with DSC technology, or some other technology?
Thunderbolt controllers have always had two DisplayPort inputs internally (maybe not Thunderbolt 1). For example, my Thunderbolt 2 2015 MacBook Pro has a controller that has two inputs. The Thunderbolt controller takes the DisplayPort signals and sends them as Thunderbolt packets which are converted back to DisplayPort by the Thunderbolt controller in the display.

The Dell 5K requires two DisplayPort 1.2 (HBR2) signals to get 5K. The same is true for the LG UltraFine 5K display (two DisplayPort 1.2 signals over Thunderbolt 3) and the iMac 27 inch 5K retina display. These displays do not use DSC.

The Dell 8K requires two DisplayPort 1.4 (HBR3) cables to get 8K. This is too much data for Thunderbolt 3 so you can't use a Thunderbolt 3 dock to connect both DisplayPort cables.

The Apple 6K display also requires two DisplayPort 1.4 (HBR3) signals to get 6K (when the GPU doesn't support DSC), but the signals can fit inside Thunderbolt 3 because 6K does not use the entire bandwidth of the HBR3 connections - there are DisplayPort stuffing symbols that are removed by the transmitting Thunderbolt 3 controller and recreated by the display's Thunderbolt 3 controller.
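Rough numbers for those two cases (my own estimates: ~2080 MHz pixel clock for 8K60 and ~1286 MHz for 6K60 with reduced blanking, at 8 bpc and 10 bpc respectively):
Code:
# HBR3 payload per cable ~ 8.1 Gbps x 4 lanes x 0.8 ~ 25.9 Gbps; two cables ~ 51.8 Gbps
# 8K60 RGB  8bpc ~ 2080 MHz x 24 bits/pixel ~ 49.9 Gbps -> needs both HBR3 cables, more than a 40 Gbps TB3 link
# 6K60 RGB 10bpc ~ 1286 MHz x 30 bits/pixel ~ 38.6 Gbps -> needs two HBR3 signals, but (with stuffing removed) fits in TB3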

3. The Pro Display XDR always uses a single Thunderbolt 3 cable (only one port) for the image, even with the latest graphics card in the Mac Pro. I wonder if the XDR is always using DSC? If there is some image sacrifice, why doesn't Apple offer a way to use full raw data transmission, without any compression, from the graphics card to the monitor (aka double cables)? IMO that's the way to cherish the superb panel of the XDR; at least I would love that option, even if it's just a psychological level of feeling.
XDR does not use DSC when connected to graphics cards that do not support DSC.

That's a good question about disabling DSC. I guess DSC has good enough quality, and using DSC allows higher USB speed.
 
The Dell 8K requires two DisplayPort 1.4 (HBR3) cables to get 8K. This is too much data for Thunderbolt 3 so you can't use a Thunderbolt 3 dock to connect both DisplayPort cables.

The Apple 6K display also requires two DisplayPort 1.4 (HBR3) signals to get 6K (when the GPU doesn't support DSC), but the signals can fit inside Thunderbolt 3 because 6K does not use the entire bandwidth of the HBR3 connections - there are DisplayPort stuffing symbols that are removed by the transmitting Thunderbolt 3 controller and recreated by the display's Thunderbolt 3 controller.

XDR does not use DSC when connected to graphics cards that do not support DSC.

That's a good question about disabling DSC. I guess DSC has good enough quality, and using DSC allows higher USB speed.

Thanks for the reply. I still don't quite understand the relationship between "DSC support" and "one TB3 cable with two HBR3":

1. If you want 5K resolution, then you only need two HBR2, or two DP 1.2, no DSC, no problem;
2. If you want 8K resolution, then you need two HBR3, or two DP 1.4, and it has to be two physical DisplayPort cables, whether or not you have DSC;
3. If you want 6K resolution, then you still need two HBR3, or two DP 1.4, but the data rate is small enough to fit into a single TB3 cable, whether or not you have DSC;
4. In the case of 6K, whether DSC is supported or not only determines whether the remaining bandwidth of the Thunderbolt 3 cable supports high-speed USB transmission.

Are the above statements correct?
 
Thanks for the reply. I still don't quite understand the relationship between "DSC support" and "one TB3 cable with two HBR3":
You have to think about what is supported by the GPU and what is supported by the display.

Intel GPUs support DisplayPort 1.2 (Ice Lake supports DisplayPort 1.4 and DSC but no Apple Mac uses that yet).
AMD GPUs before Navi support DisplayPort 1.4 without DSC (and Nvidia GPUs before RTX).
AMD GPUs starting with Navi support DisplayPort 1.4 with DSC (and Nvidia GPUs starting with RTX).

If you're including Thunderbolt, then some Thunderbolt controllers (Alpine Ridge, Falcon Ridge) support only DisplayPort 1.2 and some support DisplayPort 1.4 (Titan Ridge).
If you have a DisplayPort 1.4 GPU connected to an Alpine Ridge Thunderbolt controller (such as early Thunderbolt 3 Macs), then you can only get DisplayPort 1.2 out of the Thunderbolt controller.

1. If you want 5K resolution, then you only need two HBR2, or two DP 1.2, no DSC, no problem;
Right. DP 1.2 supports HBR2. You can have a display with a Thunderbolt 3 connection that uses two HBR2 connections over Thunderbolt 3 to get 5K 10 bpc. Some PCs only have one connection to the Thunderbolt 3 controller. In that case, you would want a display that supports 5K 8bpc with a single DisplayPort 1.4 (HBR3) connection, such as the Iiyama 5K display, or you want a display that supports DSC (the only display I know of that supports DSC is the XDR). An Nvidia GPU in Windows can deliver 5K 6 bpc using HBR2. I don't think macOS supports 6 bpc. I believe the LG UltraFine 5K and the Dell 5K do not support 5K from a single DisplayPort connection even at lower bit depths or refresh rates.

2. If you want 8K resolution, then you need two HBR3, or two DP 1.4, and it has to be two physical DisplayPort cables, whether or not you have DSC;
That is true for the Dell UP3218K for 8K60. It also supports single-cable 8K30. 8K60 can also be done using a single HBR3 connection with YCbCr 4:2:0, but I don't know if the Dell supports that mode. There may exist 8K TVs that support DSC.

3. If you want 6K resolution, then you still need two HBR3, or two DP 1.4, but the data rate is small enough to fit into a single TB3 cable, whether or not you have DSC;
You don't need two HBR3 if the GPU supports DSC. If the GPU supports DSC, then it only needs HBR2 to get 6K60 (but the GPU still needs to support DSC which is part of DisplayPort 1.4).

4. In the case of 6K, whether DSC is supported or not only determines whether the remaining bandwidth of the Thunderbolt 3 cable supports high-speed USB transmission.
Yes, with DSC, you only need HBR2 to get 6K60, leaving plenty of bandwidth on the Thunderbolt cable for USB 3.0. The connection has to be Thunderbolt 3 40 Gbps to get that. If the connection were only USB-C (DisplayPort alt mode instead of Thunderbolt alt mode), then the 4 lanes would be used by the HBR2 connection, leaving only the USB 2.0 line for USB data.
 