This problem should be non-existent with Apple's own displays like the Studio Display, the Pro Display XDR, or built-in MacBook screens, since they use a rather impressive HDR color management system called EDR. That allows for completely seamless mixing of HDR and SDR content, without showing SDR as washed out.
I know, but since the whole Mac Studio thing is based on being used with whatever mouse, keyboard, or display you like... this makes very little sense.
An Apple computer should be able to display accurate HDR on non-Apple HDR displays, especially since the only true HDR Apple display, the Pro Display XDR, is beyond the reach of 90% of creative professionals.
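For what it's worth, you can at least check how much EDR headroom macOS is actually granting a given display. This is just a minimal sketch using AppKit's NSScreen properties (run from a Swift playground or a small macOS command-line tool); a value of 1.0 means the screen is being treated as SDR-only.

```swift
import AppKit

// List each attached display with its current color space and the EDR
// headroom macOS reports for it. Values above 1.0 mean HDR content is
// allowed to go brighter than SDR white on that screen.
for screen in NSScreen.screens {
    let name = screen.localizedName
    let colorSpace = screen.colorSpace?.localizedName ?? "unknown color space"
    let potential = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    let current = screen.maximumExtendedDynamicRangeColorComponentValue
    print("\(name) [\(colorSpace)]: potential EDR headroom \(potential), current \(current)")
}
```

On Apple's own panels the potential value sits well above 1.0; on a third-party monitor it depends on what the display advertises over the link and whether HDR is enabled for it in the Displays settings.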
 
I have some off-brand monitors that claim HDR10 support, and I've plugged various Macs into them via DP or USB-C to USB-C (DP Alt Mode) to test the differences. I've managed to get a satisfactory image with various settings. I've found the best things to verify are:

1. Is your display cable rated properly?
- For example, 4K @ 120Hz HDR10 with 4:4:4 needs a certified HDMI 2.1 or DP 2.0 cable (DP 1.4 only gets you there with 4:2:2 or 4:2:0). That's 48Gbps of bandwidth for HDMI at the full spec, and DP 2.0 is needed to meet the full spec without knocking it down to 4:2:2 etc. to fit within DP 1.4's 32Gbps (rough numbers in the sketch after point 3). I've gone through several DP, HDMI, and Thunderbolt cables. Some claim to support a spec and it turns out they don't. Even from well-known brands, things become a lottery sometimes.

2. Is the display capable of receiving HDR10 at 4K @ 120Hz and so on, and displaying it correctly? Not some half-assed implementation with a low-quality panel and poor image processing? Is it verified to support DP 1.4, HDMI 2.1, and their successors?

3. Can the device (Mac) send that kind of signal?
- Most modern Macs come with Thunderbolt 4 ports, which give you 40Gb/s of total bandwidth and DP 1.4 for video. So you're already subsampling at 4:2:2 max if you're going for HDR10 at 4K and 120Hz, and that's before the GPU's driving capabilities come into question.
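To put ballpark numbers on points 1 and 3, here's a quick sketch of my own: it counts only active pixel data against approximate effective link rates after line-coding overhead, and it ignores DSC, which changes the math entirely when both ends support it.

```swift
import Foundation

// Ballpark bandwidth math for 4K @ 120Hz 10-bit HDR at different chroma
// subsampling levels. Active pixel data only; blanking adds a bit more.
// Display Stream Compression (DSC) is ignored entirely here.
func requiredGbps(width: Int, height: Int, hz: Int,
                  bitsPerComponent: Int, chromaFactor: Double) -> Double {
    // chromaFactor: 1.0 for 4:4:4, 2/3 for 4:2:2, 0.5 for 4:2:0
    let bitsPerPixel = Double(3 * bitsPerComponent) * chromaFactor
    return Double(width) * Double(height) * Double(hz) * bitsPerPixel / 1e9
}

// Approximate effective data rates after link-coding overhead:
// HDMI 2.1 FRL (16b/18b), DP 1.4 HBR3 x4 (8b/10b), DP 2.0 UHBR20 (128b/132b).
// Thunderbolt 4 tunnels DisplayPort at HBR3 rates.
let links: [(name: String, effectiveGbps: Double)] = [
    ("HDMI 2.1 (48G FRL)", 42.7),
    ("DP 1.4 / TB4 tunnel (HBR3 x4)", 25.9),
    ("DP 2.0 (UHBR20)", 77.4),
]

for (subsampling, factor) in [("4:4:4", 1.0), ("4:2:2", 2.0 / 3.0), ("4:2:0", 0.5)] {
    let need = requiredGbps(width: 3840, height: 2160, hz: 120,
                            bitsPerComponent: 10, chromaFactor: factor)
    print("4K 120Hz 10-bit \(subsampling): ~\(String(format: "%.1f", need)) Gbps")
    for link in links {
        print("  \(need <= link.effectiveGbps ? "fits" : "does not fit"): \(link.name)")
    }
}
```

The takeaway matches what I said above: uncompressed 4K 120Hz 10-bit 4:4:4 squeezes under HDMI 2.1 but not under a DP 1.4 / Thunderbolt tunnel, hence the drop to 4:2:2.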

3a. Cool, you verified 1-3... but what about the content? Is it truly HDR? Is the program you're viewing it through capable of showing it? I've run into Safari not having VP9 turned on. I never touched that setting, but I had to dig into the flags to enable it, only for an unrelated macOS reinstall to leave it on by default...
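On the "is it truly HDR" question: if you have a local copy of a clip, you can peek at the transfer function it was tagged with. A minimal sketch with AVFoundation (the file path is made up, and it uses the older synchronous track API for brevity):

```swift
import AVFoundation
import CoreMedia
import CoreVideo

// Inspect a local video file's transfer function: PQ (SMPTE ST 2084)
// or HLG are the usual markers of genuine HDR content.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mov")) // hypothetical path
for track in asset.tracks(withMediaType: .video) {
    for desc in track.formatDescriptions as! [CMFormatDescription] {
        let transfer = CMFormatDescriptionGetExtension(
            desc, extensionKey: kCVImageBufferTransferFunctionKey) as? String
        if transfer == (kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ as String) {
            print("PQ transfer function (HDR10-style)")
        } else if transfer == (kCVImageBufferTransferFunction_ITU_R_2100_HLG as String) {
            print("HLG transfer function")
        } else {
            print("SDR or unspecified transfer function: \(transfer ?? "none")")
        }
    }
}
```

PQ or HLG in the output is a good sign the source is real HDR; anything else and no amount of cable or monitor fiddling will make it pop.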

Normally 2 and 3 SHOULD be handled by the two devices during the initial handshake, but nothing works as it should sometimes. That is just the basic hardware verification before you dive into the software side of HDR: is the correct color profile loaded, does the monitor have its own HDR settings, and so on.

My monitors look great when everything is configured with a fine-tooth comb. However, the moment someone needs to connect a laptop (which happens often), it becomes a hassle: the monitor's settings are universal across all inputs, so they have to go in and change it back to SDR and tweak the picture settings, if they care enough, and it's not a quick connect.

I personally went back to SDR. My office space is built around multiple people being able to quickly dock their laptops and go. The fuss wasn't worth the fancy picture quality.
 