This problem should be non-existent with Apple's own displays like the Studio Display, the XDR, or built-in MacBook screens, since those use a rather impressive HDR color management system called EDR. That allows for completely seamless mixing of HDR and SDR content without showing SDR as washed out.
 
I know, but since the whole Mac Studio thing is based on being used with whatever mouse, keyboard, or display you like, this makes very little sense.
An Apple computer should be able to display accurate HDR on non-Apple HDR displays, especially since the only true HDR Apple display (the Pro Display XDR) is beyond the reach of 90% of creative professionals.
 
I have some off-brand monitors that claim HDR10 support. I've plugged various Macs into them via DP or USB-C (DP Alt Mode) to test the differences, and I've managed to get a satisfactory image with various settings. I've found the best things to verify are:

1. Is your display cable rated properly? (See the bandwidth sketch after this list.)
- For example, 4K @ 120Hz HDR10 with 4:4:4 needs a certified HDMI 2.1 or DP 2.0 cable (DP 1.4 only manages it at 4:2:2 or 4:2:0). That's up to 48Gbps of bandwidth for HDMI at the full spec, and DP 2.0 is needed to meet it without knocking the signal down to 4:2:2 or so to fit within the ~32Gbps of DP 1.4. I've gone through several DP, HDMI, and Thunderbolt cables; some claim to support a spec and it turns out they don't. Even from well-known brands, things become a lottery sometimes.

2. Is the display capable of receiving HDR10 4K @ 120Hz and so on, and displaying it correctly? Not some half-baked implementation with a low-quality panel and poor image processing? Is it verified to support DP 1.4, HDMI 2.1, and their successors?

3. Can the device (the Mac) send that kind of signal?
- Most modern Macs come with Thunderbolt 4 ports, so total bandwidth is 40Gb/s carrying DP 1.4. That means you're already subsampling at 4:2:2 at most if you're going for HDR10, 4K, and 120Hz, and that's before the GPU's driving capabilities come into question.

3a. Cool, you verified 1-3. But what about the content? Is it truly HDR? Is the program you're viewing it through capable of showing it? I've run into Safari not having VP9 turned on. I never touched that setting, but I had to dig into flags; then after an unrelated macOS reinstall, it was on by default...

Normally 2 and 3 SHOULD be handled by the two devices during the initial handshake, but sometimes nothing works as it should. And that's just the basic hardware verification before you dive into the software side of HDR: is the correct color profile loaded, does the monitor have its own HDR settings, and so on.
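To make the cable math in points 1 and 3 concrete, here's a rough uncompressed-bandwidth sketch (my own back-of-the-envelope numbers, not from a spec sheet; the ~5% blanking overhead is an assumption, and real links lose extra capacity to line encoding):

```python
# Rough uncompressed video bandwidth estimate (no DSC), to see why
# 4K @ 120Hz 10-bit 4:4:4 overruns DP 1.4 (~25.9Gbps payload, as
# carried by Thunderbolt 4) but fits HDMI 2.1's 48Gbps FRL link.
# Blanking overhead is approximated at ~5% (reduced-blanking timings).

def required_gbps(width, height, hz, bits_per_component, subsampling):
    # Effective components per pixel: 4:4:4 keeps all three,
    # 4:2:2 averages out to 2, 4:2:0 to 1.5.
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    pixels_per_second = width * height * hz * 1.05  # ~5% blanking
    return pixels_per_second * bits_per_component * components / 1e9

for sub in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"4K120 10-bit {sub}: ~{required_gbps(3840, 2160, 120, 10, sub):.1f} Gbps")

# Approximate output:
# 4K120 10-bit 4:4:4: ~31.4 Gbps  -> too much for DP 1.4
# 4K120 10-bit 4:2:2: ~20.9 Gbps  -> fits DP 1.4 / Thunderbolt 4
# 4K120 10-bit 4:2:0: ~15.7 Gbps
```

That's why a Thunderbolt 4 Mac ends up at 4:2:2 for 4K120 HDR10, exactly as in point 3.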

My monitors look great when everything is configured with a fine-tooth comb. However, the moment someone needs to connect a laptop (which happens often), it falls apart: the monitor's settings are universal across all inputs, so they have to go in, change it back to SDR, and tweak the picture settings if they care enough. It's not a quick connect.

I personally went back to SDR. My office space is built around multiple people being able to quickly dock their laptops and go. The fuss wasn't worth the fancy picture quality.
 
It came up in a private message that I hadn't been clear about what settings are used on an Asus PA32UCX to make it look correct in HDR mode:

ProArt Palette > Color > Gain > R (G, B) set to 100
Image > Input Range > Auto

A PA32UCX looks completely correct like this to me. Keep in mind that SDR still may not be quite as bright in HDR mode as outside it (it might be more like 400 nits than 600, I'm not positive), but it will have proper brightness, contrast, etc. compared to a standard 400-nit monitor next to it.

Is it a perfect solution? I'm not sure. I'm not really happy with mini-LED backlighting overall - it only looks good in video and, to an extent, in games. You get used to it, but with OLED improving constantly, I think OLED is the better option today.
 
The problem appears because macOS incorrectly detects the display's color mode (which is not the same thing as a color profile). It sets the color mode to "Limited range" instead of "Full range," and that washes the colors out. For me this happens with every HDMI monitor. The solution is to download the BetterDisplay app, set the right color mode for the display, and force it to protect that configuration. That solves everything.
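For anyone wondering why a range mismatch looks "washed out" specifically, here's a minimal sketch of the 8-bit level math (standard video levels: black at code 16, white at 235; the function names are mine):

```python
# Sketch of the full-vs-limited range mismatch (8-bit levels).
# Limited ("video") range puts black at code 16 and white at 235;
# full ("PC") range uses 0-255. If one side encodes limited and the
# other decodes as full, black is shown as code 16 - a dark gray -
# which is the washed-out look. The reverse mismatch clips everything
# below code 16 to black and crushes shadow detail.

def full_to_limited(v):  # encode a full-range value as limited range
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):  # decode a limited-range value to full range
    return round((v - 16) * 255 / (235 - 16))

# Mismatch 1: signal is limited, display assumes full.
print(full_to_limited(0))    # 16  -> black displayed as gray
print(full_to_limited(255))  # 235 -> white displayed slightly dim

# Mismatch 2: signal is full, display assumes limited.
print(max(0, limited_to_full(10)))  # 0 -> shadow detail crushed
```

BetterDisplay fixing it by forcing the mode is consistent with this: once both ends agree on the range, the mapping cancels out.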
 
I don't think that has anything to do with it in my case. For me, the "Limited range" color modes are chroma subsampled, and the full-range RGB mode looks basically identical. There may be other (HDR-related) settings in BetterDisplay that could help, but that's not gonna be it IMO.
 
As for me, this is most noticeable if I open a movie with dark scenes; those are just cut off and hardly visible in limited range.
I think something else entirely is at play there. Chroma subsampling would affect the crispness of text in some circumstances, but otherwise it shouldn't really have much of an impact. RGB looks better for sure, but video should be basically the same between RGB and 4:2:2 subsampled (see the sketch below).

Again, for me, both available subsampled (4:2:2) color modes are marked "limited range" while RGB is not, and switching between them makes no difference to the color levels being wrong.
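To illustrate the point about text crispness vs. video, here's a minimal sketch of 4:2:2 horizontal chroma subsampling (a simple box filter; real scalers use fancier resampling):

```python
# Minimal 4:2:2 chroma subsampling sketch: luma (Y) stays per-pixel,
# but chroma (Cb/Cr) is stored once per horizontal pixel pair.

def subsample_422(chroma_row):
    # Average each horizontal pair and repeat the value for both
    # pixels (a crude box filter; real hardware varies).
    out = []
    for i in range(0, len(chroma_row) - 1, 2):
        avg = (chroma_row[i] + chroma_row[i + 1]) // 2
        out += [avg, avg]
    return out

hard_edge = [240, 240, 240, 16, 16, 16]   # sharp chroma edge, e.g. colored text
print(subsample_422(hard_edge))           # [240, 240, 128, 128, 16, 16]
# The edge smears across a pixel pair -> fringed, soft-looking text.

gradient = [100, 104, 108, 112, 116, 120] # smooth, video-like chroma
print(subsample_422(gradient))            # [102, 102, 110, 110, 118, 118]
# Values barely move -> subsampled video looks essentially unchanged.
```

Which is also why subsampling can't explain crushed dark scenes: it touches chroma, not the luma levels that a limited/full range mismatch shifts.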
 