Are you plugging one monitor into the top TB3 ports and one into the back of the Mac Pro? You can't connect both to adjacent TB3 ports.
Documented here:
https://support.apple.com/en-us/HT210319
https://support.apple.com/mac/mac-pro
Nothing is documented about the W5700X yet, but it'll have additional TB3 ports to connect additional displays.
It'll likely be similar to the Vega cards; there's documentation on which ports to connect depending on the number and type of displays.
Connect the two 4K displays to the HDMI 2.0 outputs of the 580X. That should work, but no one has tried it. AMD graphics cards usually support 6 displays. The LG UltraFine 5K counts as two. The XDR counts as two displays when connected to the 580X, Vega II, or Vega II Duo cards (but no one has posted the AGDCDiagnose/DisplayDiagnose outputs to prove that yet). That leaves room for two 4K displays.

Looks like you can do two 5Ks and two 4Ks... can I do one 6K and two 4Ks if the two 5Ks don't work? I'm trying to rock a three-monitor setup somehow without having to upgrade my graphics card...
You have the option of picking up a cheap PC GPU to run extra displays if needed; an RX 570 is a cheap option.
The LG could work at 4K and the XDR could work at 4K or 5K (not sure) using a Moshi USB-C to DisplayPort bidirectional cable.

Neither the LG 5K nor the XDR will work with a PC GPU in this case.
However, the number of allowed connected displays may be reduced from 6 to a lower number if the total number of pixels is large enough. For example, the W5700X supports display stream compression (DSC) so that an XDR display counts as a single display but you can only connect 3 of them to the W5700X according to the specs (I would like to see someone try to connect more but who has 4 XDR displays?)
DSC also needs to be supported on the display, and I'd be very surprised if the XDR supports it. As a content creation display, lossy compression isn't exactly desirable.
It does. The ports on it actually run at USB 2 speeds unless the GPU supports DSC.
DSC is advertised as visually lossless, which so far seems to be true. It just needs to be lossless to your eye, it's not a signal used for mastering material or anything.
You can get DSC to an XDR from a 5700 XT using a Moshi USB-C to DisplayPort cable. One person posted AGDCDiagnose output for that. I think someone tested the W5700's USB-C port but not with an XDR yet, plus the card may cause crashes (maybe we need to wait for 10.15.3 for full support?).

Since you can't hook this up to a Mac Pro and work with DSC enabled (for the moment anyhow), I'd be curious whether it is enabled for the W5700X and how it impacts things... or whether the tech specs are right and Apple decided to disable DSC on the W5700X to avoid folks getting artifacting on the card.
The mtdd override files for the XDR show a tiled mode of 3006x3384 60Hz running at 649 MHz. That's 19.5 Gbps for 10 bpc, 38.9 Gbps total (two tiles), which fills up the Thunderbolt 3 bandwidth (40 Gbps), but it's not as bad as that. DisplayPort stuffing symbols are not included in Thunderbolt; the Thunderbolt controller recreates the stuffing symbols when it converts the Thunderbolt DisplayPort packets back into DisplayPort.

The XDR at 60Hz, 10 bpc needs just about 44 Gbps for the signal. DisplayPort 1.3/1.4 caps out at just over 32 Gbps, so it needs two DisplayPort 1.4 connections to drive. That's why Macs that only have DisplayPort 1.2 aren't supported (including the 2018 Mac mini).
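A quick back-of-envelope check of those tile numbers (just a sketch, assuming the 649 MHz pixel clock from the mtdd override and 10 bpc RGB):

```python
# Rough bandwidth math for the XDR tiled mode described above.
pixel_clock_hz = 649e6
bits_per_pixel = 3 * 10  # RGB at 10 bits per component

per_tile_gbps = pixel_clock_hz * bits_per_pixel / 1e9
both_tiles_gbps = 2 * per_tile_gbps

print(f"per tile:   {per_tile_gbps:.1f} Gbit/s")   # ~19.5
print(f"both tiles: {both_tiles_gbps:.1f} Gbit/s") # ~38.9, just under TB3's 40
```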
ae2e is interesting because it has a new flag DisableDSC set to 1.
With DSC, only HBR2 link rate is needed (17.28 Gbps) for 6K 12 bpc (1286.01MHz -> 61.73 Gbps) which is a compression ratio of 3.57:1 which seems too high? I only saw the 12 bpc option in a screenshot from Windows using an Nvidia RTX card. There's a document showing how 8 bpc supports 3:1 compression ratio and 10 bpc supports 3.75:1 so I guess it's ok.
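The bpc-vs-bpp arithmetic is easy to sketch (hypothetical helper, just dividing uncompressed RGB bits per pixel by the DSC target bpp):

```python
def dsc_ratio(bpc, target_bpp=8):
    """Compression ratio when DSC squeezes bpc-per-component RGB down to target_bpp."""
    return (3 * bpc) / target_bpp

print(dsc_ratio(8))       # 3.0  -> 8 bpc (24 bpp) to 8 bpp is 3:1
print(dsc_ratio(10))      # 3.75 -> 10 bpc (30 bpp) to 8 bpp is 3.75:1
print(dsc_ratio(10, 12))  # 2.5  -> 10 bpc to a 12 bpp target
```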
I would like someone to post an AGDCDiagnose output for this 6K tiled mode. Then we can know for sure if the GPU is sending two HBR3 signals (required for 6K, non DSC) to the Titan Ridge Thunderbolt controller (which usually cannot support two HBR3 signals, except Apple Macs are doing it? and they made a firmware update for the Blackmagic eGPU to do it too?).
The surprising part is that this is a new property that has never existed before.

Not surprising, TBH. No compression needed.
I think what I said is not wrong. 8 bpc = 24 bpp compresses to 8 bpp = 3:1. 10 bpc = 30 bpp compresses to 8 bpp = 3.75:1.

3.75 is if you are compressing 30 bpp to 8 bpp. Don't confuse bpc and bpp in these charts. They are very similar.
16" MBP supports DSC, so you can get up to 12 bpc.

DSC will let you use bpp values other than 8 to achieve different levels of compression. So it's more a question of which bpp modes Apple is using with something like the 16" MBP.
Actually, it seems that only the Apple Titan Ridge controllers and the Titan Ridge controller of the Blackmagic eGPU can take two HBR3 inputs (anyone have an AGDCDiagnose output to prove this?). I tried connecting two HBR3 displays to a GC-TITAN RIDGE, and only one can connect at HBR3 speed; the other connects at HBR speed.

Titan Ridge can indeed take two HBR3 (DP 1.3/1.4) inputs. Just look at the retail Titan Ridge card from Gigabyte with its two DP 1.4 inputs as an example.
You mean two HBR3 streams exceed the TB3 bandwidth. But Thunderbolt does not transmit stuffing symbols, so the full 25.92 Gbps of each HBR3 stream is not sent. Only something between 36.64 and 38.9 Gbps is sent over Thunderbolt (6K 60Hz 10bpc).

The issue is more that two HBR3 streams can exceed the TB3 bandwidth.
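To put numbers on that (a sketch using the standard DisplayPort per-lane rates and the 8b/10b coding overhead):

```python
# DisplayPort 4-lane data rates: per-lane Gbit/s * 4 lanes * 0.8 (8b/10b coding).
lane_rates = {"HBR": 2.7, "HBR2": 5.4, "HBR3": 8.1}
link = {name: round(rate * 4 * 0.8, 2) for name, rate in lane_rates.items()}
print(link)  # {'HBR': 8.64, 'HBR2': 17.28, 'HBR3': 25.92}

two_hbr3 = 2 * link["HBR3"]  # 51.84 Gbit/s of raw link capacity
print(two_hbr3 > 40)         # True: exceeds TB3 on paper...
# ...but Thunderbolt drops the stuffing symbols, so the actual 6K 60Hz 10bpc
# payload (~38.9 Gbit/s) still fits within the 40 Gbit/s TB3 link.
```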
I think the gotcha is that either the GPU or the macOS drivers have a limit. 6K is a lot of pixels - more than dual 4K, which is more than 5K. While there are enough outputs from the GPU to support four displays, the GPU cannot support four 6K displays (or the macOS AMD driver has an artificial limit).

What is surprising to me, though, is that the MBP only supports two of these, despite having 4 DP connections and using DSC as pointed out earlier. It should be able to drive 4 if it is using one of the DSC modes. Weird. I wonder if Titan Ridge can accept 2 HBR3 streams but can't output them on separate ports without exceeding internal limits? Maybe that's the gotcha? In which case, that also explains the 3-XDR limit on the W5700X, since it relies on TB3 controllers for output.
The support document https://support.apple.com/en-ca/HT210754 does not say the 6K displays need to be connected to different sides of the MacBook Pro (16 inch, 2019). That makes sense since each display only requires one DisplayPort signal (because DSC is used). It does say to connect them to different buses for best performance if possible but doesn't say it's necessary.
One strange thing in that support document:
Supports:
5K, 5K
5K, 4K, 4K, 4K

However, it doesn't say it can support:
5K, 5K, 4K

even though this option has fewer pixels than 5K, 4K, 4K, 4K. (I think both of those options assume the 5K display is not like the LG UltraFine 5K, which is dual link SST and therefore requires both DisplayPort signals from a Thunderbolt controller/bus/side.)
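The pixel counts back that up (a quick check, assuming standard 5120x2880 and 3840x2160 resolutions):

```python
px = {"5K": 5120 * 2880, "4K": 3840 * 2160}
combo_unlisted = 2 * px["5K"] + 1 * px["4K"]  # 5K, 5K, 4K (not in the support doc)
combo_listed = 1 * px["5K"] + 3 * px["4K"]    # 5K, 4K, 4K, 4K (listed as supported)
print(combo_unlisted, combo_listed)           # 37785600 39628800
print(combo_unlisted < combo_listed)          # True: the unlisted combo needs fewer pixels
```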
Right. AMD 5300M, 5500M, 5700 XT, W5700X, W5700.

Doesn't really surprise me at all. Seems like DSC support at all is relatively new to macOS, and only enabled for Navi, yes? So I'd have expected the property to show up in 10.15.2 with the Navi-dependent DSC support.
I'm pretty sure it's 12 bpc. It's in the EDID of the overlay in the ae22 mtdd override file (the ae2e mtdd file has an EDID overlay with 10 bpc). 12 bpc is also in the AGDCDiagnose and DisplayDiagnose outputs.

And are you talking 12 bpc or bpp in relation to the 16" MBP? That's what I mean by be careful. macOS doesn't use 12 bpc for frame buffers, so 12 bpc doesn't mean anything useful here. Out of curiosity, I've been reading the XDR thread more for details, and I did notice your post talking about "12 bpc" in relation to the XDR hooked up to a 16" MBP, and reading through the log, I do think it was reporting 12 bpp for the link rate. Since units are not provided, I'd expect it to be in the context of the mode it's operating in, since it is more an engineering debugging tool (I've written my fair share like these). Especially since the numbers are coming from the link data, rather than frame buffers. So the 16" MBP is at least operating in a compression mode of 2.5:1, which is much better than the 3.75:1 number you are repeating.
I didn't notice that before.

And I just noticed that AGDCDiagnose even spits stuff out wrong: "Link Bits 30 bpc"
Whoops. That should be 30 bpp. Units are fun.
Maybe the GPU isn't using a strong enough voltage for the switch. I'm not sure how any diagnostic output is going to help with a bad connection. What is the model of the DisplayPort switch? Are you using it with HBR2 or HBR3?

EDIT: I will say that I will find some use for the tools you've mentioned, though. I wonder if they would give me any useful information about some link stability issues I have with my setup, depending on how I connect a DisplayPort switch to an eGPU (requires both to reproduce, and only on certain ports of the switch, even more fun).
Yes, I think Apple is using different firmware than the default that we get with the GC-TITAN RIDGE or with PCs with Titan Ridge controllers (or new Intel CPUs with built-in Thunderbolt controllers, except those support DSC so it's not a problem there).

Which is weird, since Apple's using the same chips. The only real answer there is Apple might have a firmware fix that the Gigabyte card doesn't have for the JHL7540 (wouldn't surprise me). The chip itself is obviously capable of it if Apple is doing it. It's not like Apple has custom TB3 controllers here.
I think maintaining and using a single frame buffer is more efficient and performant. What might be interesting is a frame rate comparison for a game showing the difference between a dual link or stream and a single link or stream display at 4K (old 4K MST displays), 5K, and 6K (the tests should avoid scaling and pixel format differences).

It's possible that there's a limit as you say. I haven't worked closely with GPUs in years, so it is possible we're seeing limits to the line buffers that put an upper limit on how big the screens can be in total as well. But assuming that's true, there's not much point in enabling DSC for the XDR in that configuration either, other than to support all three displays being hooked up to the GPU's TB ports. If you have 3 XDRs, you already have enough bandwidth to feed them an uncompressed signal.
But really, it makes a lot of sense to keep DSC disabled when using a W5700X and a 6K XDR, whichever of us is right about why the limit exists. Or even a compromise of only using DSC if the system detects it needs to in order to support the displays as they are connected. That would certainly be "user friendly" if not intuitive.
Even though the Dell UP3218K (2017) was released after DisplayPort 1.4 (2016), GPUs didn't support DSC until later (Navi/RTX? nothing previously?). DisplayPort 1.4a with DSC 1.2a was released in 2018, but I don't think more than DSC 1.2 would be required for an 8K display with the Dell UP3218K specs (not HDR). I think the Dell also works with DisplayPort 1.3 at 8K.

At this point though, even Dell has shrugged at DSC for their 8K display aimed at content creation, opting instead for dual link. Makes me wonder how long before we start seeing DSC-capable 8K displays, and how things shake out in terms of favoring DSC vs dual link for consumer vs production outside the Apple ecosystem. So far it seems like this stuff is still very much in its infancy as an actual shipping technology, and Apple may be one of the first actually using DSC.
Yup, not many 5K single link displays out there (it's not DSC, so it's only 8 bpc, because DisplayPort 1.4 only has 50% more bandwidth than DisplayPort 1.2 and therefore only 75% of the bandwidth of dual DisplayPort 1.2, which allows 10 bpc on the LG UltraFine 5K).

I'm not too surprised by an oversight like this. It took something like the Planar IX2790 for a 27" 5K panel to get matched up to a DisplayPort 1.4 connection anyway. It's borderline grey market kit in the US on top of that.
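That 50%/75% comparison checks out (sketch using the 8b/10b-coded data rates):

```python
hbr2 = 5.4 * 4 * 0.8  # DP 1.2 max data rate: 17.28 Gbit/s
hbr3 = 8.1 * 4 * 0.8  # DP 1.3/1.4 max data rate: 25.92 Gbit/s

print(round(hbr3 / hbr2, 2))        # 1.5  -> DP 1.4 has 50% more bandwidth than DP 1.2
print(round(hbr3 / (2 * hbr2), 2))  # 0.75 -> but only 75% of the dual DP 1.2 feeding the LG 5K
```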
In the two DisplayDiagnose outputs that I've seen, it's called "BITDEPTH". In the case where HBR2 DSC is used, BITDEPTH is 12. In the case where dual HBR3 is used, BITDEPTH is 10. Maybe the DSC compression algorithm can produce a better result with 12 bpc input.
As for frame buffer, I think you're right. I don't think I've seen a 36 bit frame buffer. Only "30-Bit Color (ARGB2101010)" and "24-Bit Color (ARGB8888)".
The W5700X will probably use DSC like the rest of the Navi cards.