
Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
Evidently this is SUPER hard to explain, as I'm on AppleCare "Senior Advisor" #3 who still doesn't understand the issue... but I'm going to try again...

The INTERNAL BUILT-IN SCREEN on my 2018 MacBook Pro 15" has HDR built in and it's always on (there isn't even an option to turn it off). I can watch an HDR YouTube video in beautiful HDR, then immediately go back to writing a document in Apple's Pages app, and the background of Pages is PURE white. I can watch another HDR video then immediately go back into Photoshop, and the photo I'm working on has proper colors - everything is nice and vibrant, whites are white. It's beautiful.

BUT, if I plug an external HDR10-compliant monitor or 4K TV into that SAME 2018 MacBook Pro 15", it does NOT work that same way. In Settings > Displays there's now a setting to turn HDR on or off for the external monitor. With HDR OFF every regular program looks like it should, but I can NOT watch HDR content in actual HDR... When I turn HDR ON I can watch HDR content, but then EVERY OTHER (non-HDR) PROGRAM is muted - whites are gray, colors are not vibrant anymore. I can't believe Apple would cripple the functionality of external monitors this way (people can NOT toggle HDR on and off in settings constantly) - which is why I opened a ticket to attempt to get this fixed. It is completely asinine and unusable as-is.

Anybody have a fix for this? Anybody experiencing the same? Apple says they've NEVER, EVER heard of this issue... so if you do have it, PLEASE open a ticket with Apple (FYI, you can open a ticket even if you don't have AppleCare or your Mac is way past warranty - they told me you just need to be running the newest macOS Ventura), and/or report it at Apple.com/Feedback.

Here are examples of how whites show up with HDR on and off (please excuse my bad photo of the LCD screen and my bad Photoshop job to remove the text of the draft email I was typing in the one white window lol).

[Image: EAmrcY7.jpg]

[Image: KaXVz3Z.jpg]



_________________________________________
_________________________________________


Below is the NON-TLDR version lol, if you want more background info: I actually opened this AppleCare ticket for my new 2023 Mac mini M2 Pro, since this problem persists on that as well – it was just easier to explain it using my MacBook Pro as the example, which is also where I first experienced this problem.

I've tried 3x TCL 4K HDR TVs, plus Sony, Samsung, and Hisense 4K HDR TVs too (usually 65-75" models - what my friends/family have - I don't think I even know anybody who uses an actual computer monitor anymore, let alone one with HDR). I'm also pretty positive this happened on my LG UltraFine (4 or 5K?) HDR monitor I had to give back to work a while ago. Tried numerous certified HDMI cables (which work on PS4 HDR and Xbox One HDR perfectly), and would have been using USB-C on the LG I believe. Happens the same whether it's straight to my TV, or running through my Denon receiver (which handles HDR and 4K60 properly fyi). Formatted each computer from a USB stick (deleting the partition first), so it's NOT a bad OS install.

A similar (or is it the same?) issue seems to have been plaguing people since Catalina, I think. And those earlier ones were NOT only about HDR-capable external monitors - regular external monitors too. Some have it only on HDMI, some (but by no means all) find using USB-C fixes it. Seems to be something about how Macs send a YPbPr/YCbCr signal to monitors, and if they used RGB instead the problem goes away? Key words are always "gray" or "grey," "washed out," "muted colors," etc. There's even a post here about faking an EDID to combat a similar issue. It's such a big issue there are posts ALL over discussions.apple.com (if anybody needs links, I have like ~10+ saved... I added up all the posts and the "Me Too" button counts and there are 2,045 people with the same issue!), and on Reddit as well (even with a friggin' GitHub project to fix it)!

Also, and I never tried this, but there are MANY reports that if you boot the affected Intel Mac into the Windows Boot Camp partition, this problem goes away and all is perfect on the same external monitor... so it is NOT a hardware limitation, it's in the macOS programming somewhere.

One more weird thing: when I take a screenshot (not with my phone like the pics above, but using Shift+Command+3), one with HDR on and another with HDR off, they BOTH read pure white in Photoshop's color picker (RGB 255/255/255, hex #FFFFFF), but to my naked eye it's like night and day! Apple of course asked for some video showing this... and with the camera white balancing and/or adjusting brightness for everything I could NOT capture it on camera (even with my fancy DSLR)... So I literally broke out my X-Rite colorimeter and took a white reading (in real time! - filming all this for Apple) with HDR off, which was pure white, then once I switched on HDR the white turned to gray... even with this 'scientific' proof on film... Apple still doesn't seem to get it...
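If the thing that changes is the nit level the display assigns to "white," that would at least explain why both screenshots read the same: a screenshot stores the pixel value, while the luminance the display gives that value is decided later in the pipeline. Here's a minimal sketch of that idea in Swift (the nit numbers are made-up placeholders for illustration, not anything macOS reports):

Code:
// Made-up placeholder numbers, just to show the idea: a screenshot stores the pixel
// value, not the luminance the display assigns to it, so "white" reads 255/255/255 either way.
let whitePixel = 1.0                    // pure white in an SDR screenshot (255/255/255)
let sdrWhiteNitsHDROff = 380.0          // example: what the display maps 1.0 to with HDR off
let sdrWhiteNitsHDROn  = 180.0          // example: what the display maps 1.0 to with HDR on
print(whitePixel * sdrWhiteNitsHDROff)  // ~380 nits on screen
print(whitePixel * sdrWhiteNitsHDROn)   // ~180 nits on screen - same pixel value, half the light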
 
  • Like
Reactions: illumna97

iMacDragon

macrumors 68020
Oct 18, 2008
2,399
734
UK
It should be easy enough to capture with a DSLR with everything set to manual, and not letting the camera auto-adjust anything.

But yes, I agree that the Mac should have an option to adjust non-HDR content brightness for external HDR displays in some fashion.
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
It should be easy enough to capture with a DSLR with everything set to manual, and not letting the camera auto-adjust anything.
Couldn't capture it with a Sony A6400 and 16-55 F2.8 G lens, with everything set to manual... which is why I resorted to the colorimeter after that (which is better all around, because those are hard color-specific numbers being metered in real-time).

The two pictures I took tonight, and posted, were actually the best ones I have taken of this issue so far... (and to be honest they exaggerate it a bit compared to reality... but it's still VERY noticeable and bad in person).




But yes, I agree that the Mac should have an option to adjust non-HDR content brightness for external HDR displays in some fashion.
I think I might have failed (again) to properly explain this - sorry...

All I can further say is that it's not just about brightness. The actual colors of EVERYTHING (whites, grays, reds, greens, and blues) all shift with HDR off vs. on. Even the colorimeter (going by memory here) saw a change of about 12% just in blue between the two (or something like that, I forget the other values).

For instance, you can have a beautiful, brilliant kelly green, and even with the backlight nits/candela being way higher or lower you can still tell it's the same base color... that doesn't seem to be the case when HDR goes from off to on (I'm talking about non-HDR content) - it literally seems to be shifting colors (which is in line with a misinterpreted color space/gamut... I think... if I understand that even remotely correctly - out of my league on that tbh).
 

iMacDragon

macrumors 68020
Oct 18, 2008
2,399
734
UK
I believe the white that looks grey is probably 100 nits, which in context looks grey, whereas with HDR off everything is probably 300-500. I recall the exact same issue back with Windows HDR, until they added the SDR/HDR brightness balance that lets you boost the SDR level to your preferred level.

With the internal display, Apple is automatically doing 500 nits for SDR content, or equivalent scaling by brightness; with an external display I think they're following the signal spec, where SDR is specified at 100 nits.
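If anyone wants to see what macOS itself thinks each screen can do, here's a minimal Swift sketch (just a diagnostic idea, assuming macOS 10.15 or later; not anything Apple documents for this problem). It prints each display's EDR headroom, i.e. how many times brighter than SDR white the system believes that screen can go right now and at best; 1.0 means no headroom at all.

Code:
import AppKit

// Minimal diagnostic: print each display's EDR headroom as macOS reports it.
// Values are multiples of SDR white, so 1.0 means no headroom at all.
for screen in NSScreen.screens {
    let now  = screen.maximumExtendedDynamicRangeColorComponentValue           // headroom right now
    let best = screen.maximumPotentialExtendedDynamicRangeColorComponentValue  // best the screen could ever offer
    print("\(screen.localizedName): \(now)x now, up to \(best)x above SDR white")
}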
 
  • Like
Reactions: wegster

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
I believe the white that looks grey is probably 100 nits, which in context looks grey, whereas with HDR off everything is probably 300-500. I recall the exact same issue back with Windows HDR, until they added the SDR/HDR brightness balance that lets you boost the SDR level to your preferred level.

With the internal display, Apple is automatically doing 500 nits for SDR content, or equivalent scaling by brightness; with an external display I think they're following the signal spec, where SDR is specified at 100 nits.
There might be something in that. My X-Rite colorimeter in the HCFR calibration software says that with HDR OFF my whites are 377 candela/m² (I think that's the same thing as "nits"). With HDR turned on, viewing that same exact NON-HDR content, white drops way down to 177... so it's nearly HALF as bright...

Again, on the MacBook Pro's built-in screen, EVERYTHING is PERFECT!!!! - so they know how to do this right... it's just external monitors which seem to have this issue...
 
  • Like
Reactions: srgz

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
Hope it's ok with the Mods if I occasionally post here when I see other people having this issue (as I find them on this forum or randomly elsewhere). Kinda trying to get all the info together in one place, with hope that a fix will be forthcoming at some point.

Here's one from yesterday. They're using an eGPU, which makes it kinda unusual to begin with, but SAME EXACT problem and symptoms: Windows on the SAME computer works with HDR and SDR perfectly, so it's not a hardware issue... but switch to macOS, turn on HDR in Display settings, and ALL non-HDR (SDR) content "just gets washed out" - just like me and all the others have experienced.

**Edit 2-17-23. Went to the Apple store today, they had a 2023 Mac mini M2 hooked up to their Studio Display... wanted to see if I could duplicate my issue on that, but alas, unlike ALL these other monitors I've tried there is NO "HDR" toggle for on or off - seems to always be on – so of course it worked as expected (just like the built-in screen on a MacBook always works for both HDR and SDR content properly).

**Edit 2-19-23. Got my own Hisense (75U8H) TV delivered (I previously tried a 65" U8G?), and the same issue persists... trying to nail down which HDR-compliant USB-C to HDMI cable/adapter to get to see if that helps with this issue, since all I've used before are straight HDMI to HDMI cables.
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
Have there been any updates on this? (Experiencing the same issues as the OP)
None. Sadly, just like the other tickets (literally 10+ that I have escalated to 'senior engineer teams'), Apple says this is how it's supposed to work... somehow... and that their Mac works perfectly, and it's an issue with my particular monitor/4K TV (despite trying over 10 of them now, of multiple brands and models, including all the best-selling, highest-rated, industry-gold-standard models of the last few years...).

Apple also had no comment when I told them the SAME EXACT Mac (my older Intel one) running Windows 10 Boot Camp does NOT have this issue and works perfectly with HDR and non-HDR content without having to manually switch a setting on and off every time...
 

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
My understanding is that this really is the expected behavior. Many HDR-capable-on-paper monitors just don't get bright enough, and in order to distinguish the many times higher brightness of the literal sun compared to say a piece of white paper (diffuse white) they have to make the regular white really dim. This works and looks good in a dark room where even the dim white looks bright to your eyes, so the max-brightness highlights really convey the message.

Apple's own displays don't suffer from this, because Apple controls the whole stack, the apps, the OS and the hardware, and they use a bunch of clever algorithms (they call this Apple EDR) in a way that basically lets you control how bright the UI white gets (that's the OS brightness setting) and then uses the brightness headroom (from the user-selected level to the actual max brightness of the screen) for HDR-content highlights.
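For what it's worth, this is also roughly what Apple exposes to developers: a Metal-backed layer can opt into EDR, and then pixel values above 1.0 spill into the headroom above SDR white while 0–1 stays ordinary SDR. A rough Swift sketch of just the opt-in (my reading of the public API, not a working renderer):

Code:
import Metal
import QuartzCore
import CoreGraphics

// Rough sketch of the EDR opt-in for a Metal-backed layer.
// Pixel values 0.0-1.0 stay ordinary SDR; values above 1.0 can use whatever
// headroom the display currently offers above SDR white.
let layer = CAMetalLayer()
layer.device = MTLCreateSystemDefaultDevice()
layer.wantsExtendedDynamicRangeContent = true                           // ask for EDR
layer.pixelFormat = .rgba16Float                                        // float format so values > 1.0 survive
layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)  // extended-range color space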
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
My understanding is that this really is the expected behavior. Many HDR-capable-on-paper monitors just don't get bright enough, and in order to distinguish the many times higher brightness of the literal sun compared to say a piece of white paper (diffuse white) they have to make the regular white really dim. This works and looks good in a dark room where even the dim white looks bright to your eyes, so the max-brightness highlights really convey the message.
Sorry, but that's bullsh!t (don't mean you, mean Apple if that's really the blatant lie of an excuse they're using).

My main TV/monitor (and it's a budget one, even...) hits 1,500 nits of brightness - measured with a calibrated colorimeter by multiple professional reviewers. Apple's $5,000 Pro Display XDR, for example (which does normal and HDR content correctly), is 1,600 nits - and that's even pushing it for that monitor: it literally has a disclaimer in its specs saying it can only do that if the ambient temperature is below 75º F or something like that lol... Tons of TV models can hit similar contrast ratios as well, if not even greater than Apple's spec. There is NO reason, and NO excuse why other HDR-capable monitors can't be used and have both regular and HDR content appear correctly on the same screen...

Also, if you're talking about needing a certain level of (über-high) brightness in order to have regular content display properly alongside HDR content... then how does my 2018 MacBook Pro 15" with its mere 500-nit max display (a mere THIRD of my non-Apple external display's brightness) still have both regular and HDR content display properly? ...again, and I'm really not trying to be mean here at all, but your (or Apple's???) argument holds no water whatsoever.

Last, it also STILL doesn't explain how, on these same exact (Intel) Macs running Windows through Boot Camp, Windows can display both regular content + HDR content perfectly fine ON THE SAME EXACT HARDWARE!!!
 

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
I don't know and wasn't able to find the actual brightness value of UI white that macOS outputs when in HDR mode, but I'd expect it to be somewhere between 100-500 nits, which is not very bright, especially in a well-lit room. You see, when you plug an HDR display into a Mac, the display just says "I support HDR" but macOS has no idea about its real brightness, image post-processing, or the lighting conditions in the room. So in order to be able to give HDR content the headroom it requires, I can imagine macOS playing it safe and setting SDR white to something low — remember that there are "HDR" displays with barely 600 nits of brightness.

And even for nicer, actual HDR monitors like the ones with 1500 nits of max brightness, you still want to give HDR as much headroom as possible, and at the same time you don't want to make SDR white too bright, because on an SDR photo of someone holding a piece of white paper (roughly what UI white should compare to) the paper can't look unnaturally blown out.

NO reason, and NO excuse why other HDR-capable monitors can't be used and have both regular and HDR content appear correctly on the same screen...
So this is kind of incorrect in a fuzzy, subjective way. The more similar your SDR highlights and HDR highlights look, the more you're doing HDR wrong. And the less similar you want them to look, the more you need higher max brightness and dimmer UI white.

Also, if you're talking about needing a certain level of (über-high) brightness in order to have regular content display properly alongside HDR content... then how does my 2018 MacBook Pro 15" with its mere 500-nit max display (a mere THIRD of my non-Apple external display's brightness) still have both regular and HDR content display properly?
Now with Apple-made displays, that's a different story. As I said in my previous post, the low-brightness MacBook displays don't display HDR properly or correctly most of the time, but they use many clever tricks, which are just not technically possible for 3rd-party displays, in order to display as much HDR as possible.
The brightness slider on your MacBook controls roughly the brightness of UI white. When you're in a dim room with your SDR white set to 300 nits and HDR content appears on the screen, macOS' EDR technology will use the 500-nit capability of your display to push the HDR highlights to 500 nits, so they stand out against the 300-nit SDR white.
If you ramp up the brightness to 100%, SDR white will always shine at 500 nits and HDR highlights won't stand out anymore. If you lower the brightness to 250 nits, HDR highlights at 500 nits will look 2x brighter than your SDR highlights (AFAIK this is roughly the max EDR headroom you can get on a 500-nit display; if you lower your brightness even more, HDR content won't use the full 500-nit brightness anymore).

See this Apple Developer video for technical details: https://developer.apple.com/videos/play/wwdc2021/10161/
STILL doesn't explain how, on these same exact (Intel) Macs running Windows through Boot Camp, Windows can display both regular content + HDR content perfectly fine
As you already know by now, Windows is not displaying HDR and SDR together correctly either, it just does it a different kind of wrong that looks better to you. Windows AFAIK (I can't check, I don't have it) allows you to manually boost SDR content to levels that don't make sense in the HDR context (a piece of SDR paper as bright as a shining HDR lamp), but that makes the consumption of SDR content possible even in brighter environments where its true brightness wouldn't be enough.
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
I don't know and wasn't able to find the actual brightness value of UI white that macOS outputs when in HDR mode, but I'd expect it to be somewhere between 100-500 nits, which is not very bright, especially in a well-lit room.
So this is kind of incorrect in a fuzzy, subjective way. The more similar your SDR highlights and HDR highlights look, the more you're doing HDR wrong. And the less similar you want them to look, the more you need higher max brightness and dimmer UI white.
As you already know by now, Windows is not displaying HDR and SDR together correctly either, it just does it a different kind of wrong that looks better to you.

Honestly, it doesn't matter if HDR is properly spec'd HDR or not, because there's currently a format war going on and there is NO, ZERO set standard for HDR. There's HDR, HDR10, HDR10+, Dolby Vision, the HLG broadcast standard... and depending on the device you use (computer, Chromecast, Roku, Apple TV, Fire Stick, etc.) it's going to be processed and used differently. Hell, certain streaming apps only use ONE type, and/or can convert differently on the SAME app/service depending on the device. Even Samsung themselves have pretty much banned Dolby Vision (one of the most popular for streaming services) on their TVs since (internet wisdom says) they didn't want to pay for the licensing... different studios are even choosing different formats: Lionsgate is using Dolby Vision, Warner Bros is using HDR10+ on their Blu-ray releases, etc. etc. I.e., it's a crap-show.

NONE of this removes the FACT that Apple computers, on ANY external display (except their $5,000 model), do NOT allow you to watch HDR content AND use your computer normally (i.e. regular non-HDR stuff - which is the vast majority of what we do). I.e. Apple does NOT do HDR properly. On my own screens, measured with a calibrated industry-standard colorimeter, the brightness of regular SDR content is HALVED when HDR is turned on (per my earlier post: 377 nits with HDR off, 177 nits with HDR on). Apple is doing it wrong, period. It is UNUSABLE...

Apple is supposed to be - and many people use them for these reasons - the computer where the OS is more intelligently designed, easier to use, more seamless, less hassle, where things tend to "just work" without having to jump through hoops... Yet this is so blatantly broken there's not even a hoop to jump through in this regard - because it's so severely broken. If they didn't want to allow HDR, period, why would there even be an "HDR" slider in System Settings?

You can give all the excuses you want, but this is a fault of Apple, period. It is defective. They refuse to even admit it's a problem. Shame on them.
 

tonyoramos1

macrumors newbie
Apr 24, 2011
9
0
Dude, I don't think it's just the UI brightness... I am attached to an LG OLED TV right now and when I flip between the built-in Netflix app and Netflix running on a MacBook Pro M2, the brightness on the Mac's output is easily halved *despite using the exact same settings on the TV*.

This is ridiculous - not only that most users don't even know about this, but that you can't get true HDR out of an Apple MacBook unless you buy a $5,000 Apple monitor. Unbelievable!

My whole UI is of course darker than if I turn off HDR in settings.
 

tonyoramos1

macrumors newbie
Apr 24, 2011
9
0
Guys, I figured this out: it was the cable. My Amazon-brand cable said it was rated for 4K 60Hz, High Speed 18Gbps, but it clearly wasn't, because when I lowered the refresh rate to 30Hz the Windows machine suddenly got brighter.

When I replaced the cable with an 8K-capable 48Gbps Ultra High Speed cable, HDR suddenly worked on my MacBook!

I am seriously wondering if companies are mislabeling cables out there and it's causing a lot of people problems.
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
Guys, I figured this out: it was the cable. "High Speed" HDMI cables don't have enough bandwidth; you literally need so-called "Premium High Speed" cables, which come with more bandwidth, to get 4K 60 out of the Mac. Or just get an Ultra High Speed cable, which supports 4K 120 / 8K.

I figured this out when I tried a Windows machine and it had the same dim-image problem, but after I saw somewhere on Reddit that people were using this as a solution, I set the refresh rate to 30Hz and the image suddenly got a little brighter.

I think everyone out there has bad cables because they just buy the most common HDMI 2.0 cable on Amazon, not realizing they are not all the same. They are NOT.

You mean like these ones?

[Image: bGFJUsV.jpg]


Which I have 4x of, rated for up to 8K@60 / 4K@120Hz, when I'm only running 4K@60... I also always pay extra for them to be "certified" – meaning they're actually tested to deliver the bandwidth they advertise. Wish the solution were as easy as new cables, but it's not...

The problem still persists on multiple different HDR-compatible screens, using multiple different cables, on multiple different HDMI inputs (as commonly only certain HDMI inputs give you the higher bandwidth), on multiple different supposedly HDR-compatible Macs.
 

tonyoramos1

macrumors newbie
Apr 24, 2011
9
0
All I can tell you is replacing my Amazon cable that was rated for 4K 60 fixed my problem! I had to go up to a 48Gbps cable, and was just logging on to correct my answer... but it turns out this may not work for you. That really sucks!

If I figure out anything else I'll let you know. Does a Windows machine get HDR for you? Do you have access to a Windows machine to test? If you do, try 30Hz like I did - my brightness suddenly went up, which is how I figured out the cable was full of it: it lacked the bandwidth for 4K 60 even though the product description said it had it (Amazon brand).
 

srgz

macrumors regular
Aug 22, 2010
134
82
Any updates? Are we really the only ones using non-Pro-XDR, HDR-capable screens on macOS? I've got an Asus ProArt PA32UCK and I just turn HDR off because I literally cannot figure out how to make it look good with HDR turned on...
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
Any updates? Are we really the only ones using non-Pro-XDR, HDR-capable screens on macOS? I've got an Asus ProArt PA32UCK and I just turn HDR off because I literally cannot figure out how to make it look good with HDR turned on...
None. See my post #8 above (in a nutshell: Apple says all is working properly... when it's not)... and I can't personally open another ticket for this SAME issue - EVERYBODY here who has experienced or is experiencing this should open a ticket, otherwise Apple just won't care.

The problem still persists in macOS Sonoma too...
 

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
I still don't understand, what do you guys want the system to do? What's the correct way for you?

If your monitor does X nits of max brightness, in SDR mode obviously X can be your UI white, because that's the whitest brightest color you'll ever need to display. On the other hand, in HDR mode, X now has to become the brightest HDR highlight, and your UI white must go down, because perception-wise it doesn't make sense to make UI white as bright as the sun.
 
  • Like
Reactions: srgz

srgz

macrumors regular
Aug 22, 2010
134
82
I still don't understand, what do you guys want the system to do? What's the correct way for you?

If your monitor does X nits of max brightness, in SDR mode obviously X can be your UI white, because that's the whitest brightest color you'll ever need to display. On the other hand, in HDR mode, X now has to become the brightest HDR highlight, and your UI white must go down, because perception-wise it doesn't make sense to make UI white as bright as the sun.

Turning on HDR should NOT have any effect on how SDR content is displayed. It should look exactly the same as with HDR turned off. And it does, on Apple screens. But not on any other screens.

The “workaround” proposed by Asus etc. is to only turn on HDR when you’re watching or editing HDR content, and even then it should always be full screen — a restriction that, again, Apple’s displays don’t suffer from.
 

srgz

macrumors regular
Aug 22, 2010
134
82
“Apple does NOT do HDR properly.”

“Yet this is so blatantly broken there's not even a hoop to jump through in this regard - because it's so severely broken. If they didn't want to allow HDR, period, why would there even be an "HDR" slider in System Settings?”
Apple are actually doing HDR *right* in my opinion — but sadly, only on their own displays and devices. I’ve researched this and talked to a few people, and now I get it.

The reason why I’m saying they’re doing it right — is because the HDR experience on MacOS with Apple’s screens is second to none.

On Apple’s own screens / devices:

1.) No HDR on/off toggle is needed
2.) SDR & HDR content can coexist side by side without altering SDR brightness or colors
3.) Managed color & consistent viewing experience across multiple devices and even display technologies
4.) A best-in-class HDR experience even on displays that don’t have hardware HDR support

But in order to do what they do, they need to know exactly how bright (in nits) the display is, as well as have complete software control of the display’s actual hardware brightness, and not just the display as a whole but each individual dimming zone in the case of their FALD-enabled screens.

This is what allows them to be able to display SDR and HDR content side by side with both looking as it should, and also allows a consistent viewing experience across multiple different display technologies.

It also allows them to show HDR-graded/mastered content on displays that technically don’t have “official” HDR support, like the Studio Display or older MacBook Pros.

You can play back Dolby Vision content on the Studio Display, and it will undoubtedly look better than pretty much every other VESA DisplayHDR 600-certified display on the market.

The fact of the matter here is that there are no real industry standards for doing any of this. Also, Windows (and Linux) is a complete **** show, so Apple made their own (superior) system, and unfortunately it requires their own displays in order for it to work the way they intended.

I don’t believe they did this on purpose to lock out third party displays and be evil scumbags. They did it because there was really no other way to do it without compromising.

What Microsoft does as a workaround is give you a brightness slider for SDR content when you turn on HDR.

Should Apple do this if you’re not using their displays?

Maybe. But it would screw with / break everything they’re already doing (which is objectively better), and it STILL would never look as good as a properly functioning EDR capable display.

Some Windows laptop manufacturers have implemented their own versions of EDR, but even then, it will only work on the laptop’s internal display — any external display will function exactly as it does on macOS, albeit with the added bonus of that SDR brightness slider.

Given what I now know about this and all the work they have put into EDR, I kinda wish they would just license the tech to other display manufacturers — even if they restrict it for use on MacOS only.

But, then I suppose no one would buy a Pro Display XDR, if you could get a similar experience (but not exactly) for less than half the price 🤷‍♂️

So, what can Apple do to “fix” this and allow a better HDR experience on third party displays?

Really the only thing would be to add a SDR brightness slider for third party displays when HDR is enabled, as Microsoft does, however I honestly don’t think they will do this. If you’re going to bug Apple about it, that would probably be the thing to ask of them.

Until then, really all you can do is what Asus etc. suggest — keep HDR turned off unless you actually want to consume or edit HDR content, and make that content full screen. It sucks, but that’s how it is.
 
  • Like
Reactions: alfogator

okkibs

macrumors 65816
Sep 17, 2022
1,070
1,005
What MS does with the brightness slider is the right way to do it - giving the customer a choice. Apple could allow an optional brightness setting to switch from "Apple recommended" mode into a custom mode.

There are some wrong assumptions here: the OP's 15" 2018 MBP is not HDR-capable. It is merely capable of playing HDR media in SDR mode because it supports 10-bit color depth processing.

The actual hardware has no 10-bit capability - neither the cheaper temporal-dithering 10-bit that the displays in the 14" and 16" MBP use (which is what you'll want for watching HDR content in HDR mode without breaking the bank), nor the real 10-bit that Apple's XDR display has (one reason for its price tag).

So the 10-bit processing support merely means that this MacBook can play HDR media. It cannot show that HDR content in HDR mode, as it lacks any and all display hardware for it, such as the proper backlight tech (local dimming, or better, OLED, where each pixel can be turned on and off individually to avoid blooming).
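If you want to check the "can play HDR media" half on your own machine, here's a one-line Swift sketch (my suggestion, assuming a reasonably recent macOS): AVPlayer reports whether HDR video playback is possible at all, which is a separate question from whether the attached display can actually show it in HDR.

Code:
import AVFoundation

// Whether this Mac can decode/play HDR video at all - separate from whether
// the current display can actually present it in HDR mode.
print("Eligible for HDR playback: \(AVPlayer.eligibleForHDRPlayback)")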

Went to the Apple store today, they had a 2023 Mac mini M2 hooked up to their Studio Display... wanted to see if I could duplicate my issue on that, but alas, unlike ALL these other monitors I've tried there is NO "HDR" toggle for on or off - seems to always be on –
It has no toggle because it has no HDR, same as on your own Mac. This display only has SDR mode. If you check Apple's website they specifically advertise the XDR display with HDR support but they make no mention of it for the Studio Display, and so there can be no toggle for a functionality that it doesn't have.

If you have an XDR display or a 14" or 16" MBP with the mini-LED screen then you actually get an HDR toggle in macOS:
[Image: Screenshot 2023-11-26 at 15.52.50.png]


But as you noticed Apple treats their own displays differently. The HDR mode just works and you never need to disable it. It proves that Apple is absolutely capable of handling HDR displays correctly and just chose not to allow any manual brightness slider for third party displays.

I am actually setting up an Alienware AW3423 soon that has proper HDR support but is of course not from Apple, so then I can see for myself how bad it really is. But honestly, everything on the market right now is either not that good (like this Alienware) or expensive. The XDR display has amazing HDR but the price tag is not for everyone, and even that amazing OLED panel in the Alienware is merely a 1440p panel that barely reaches 1000 nits for small highlights, whereas Apple's HDR monitors can do 1000 nits across the entire panel (and even more for highlights) and offer better resolution. And that OLED panel has a bad sub-pixel layout that makes it unsuitable for most serious work.

I use a TV just for watching TV, so there I'd just turn HDR on for HDR content and off for all other content. If you use a TV as a computer screen replacement you'll be unhappy for sure with how macOS handles HDR then. I thought of doing that when the LG C1 was released, but now I am glad I skipped it, as the panel is mediocre and with 40"+ TV panels I don't expect to get the high-quality tech I expect from a desktop monitor.
 

Dochartaigh

macrumors member
Original poster
Feb 3, 2023
88
33
I still don't understand, what do you guys want the system to do? What's the correct way for you?

If your monitor does X nits of max brightness, in SDR mode obviously X can be your UI white, because that's the whitest brightest color you'll ever need to display. On the other hand, in HDR mode, X now has to become the brightest HDR highlight, and your UI white must go down, because perception-wise it doesn't make sense to make UI white as bright as the sun.
It's simple. We want non-Apple HDR-compatible external monitors to work JUST LIKE THE MAC'S OWN HDR-compatible screens. That's it.

On a Mac you can seamlessly switch from SDR to HDR content to your heart's content WITHOUT having to turn an HDR setting on/off every time – and everything looks proper. Windows has done this for years now (hell, I can run Windows via Boot Camp on a Mac and HDR works right!).

Right now, on a NON-Apple but HDR-compatible screen, when you turn on HDR every single SDR program (which is pretty much all of them - everything that isn't a video player) is screwed up, super dim, and IMO unusable (white literally turns gray in SDR programs!).

Apple has made this work properly on their own devices, but nothing else, and there is NO reason for it. Everything from pretty old MacBook Pro laptops with a measly 500 nits of brightness, to their flagship $5K Pro Display XDR with 1,600 nits, works with both HDR and SDR content at the same time just fine. There is NO reason why they can't make this work on non-Apple external displays. None.



But in order to do what they do, they need to know exactly how bright (in nits) the display is, as well as have complete software control of the display’s actual hardware brightness, and not just the display as a whole but each individual dimming zone in the case of their FALD-enabled screens.
The fact of the matter here is that there are no real industry standards for doing any of this
Thank you for your post - and you've hit upon a ton of things I previously mentioned, and we agree upon... but I'm sorry to say your logic here is a bit flawed...

You say there is NO industry standard for HDR - which is totally correct and accurate... but because there is NO standard, that means Apple's own displays can NEVER, EVER be (even close to) 100% accurate for all the different flavors of HDR that different services/content use... they're basically forced to take their best educated guess at all of this...

SO, just like they guess for their own hardware, they could guess for other HDR-compatible monitors too (where a slider, like the one Windows uses as mentioned, would be welcome). But they don't... and to my disgust they LIE about it. They tell me everything is working perfectly as it should on many, many different HDR-compatible external displays I've tested, on 3x different HDR-compatible Macs (with numerous different high-bandwidth cables). I even sent a detailed video to their senior people, with an industry-standard CALIBRATED colorimeter showing them the exact issue... and again, everything is somehow working "as it should" (as if SDR white content should literally be gray instead of white when in HDR mode lol). They should be ashamed of themselves.
 

srgz

macrumors regular
Aug 22, 2010
134
82

[Image: Screenshot 2023-11-26 at 6.03.05 PM.png]


okkibs -- That is not an HDR on/off toggle. That's a color management preset that includes support for HDR brightness/luminance profiles. It's completely different. They have essentially combined managed color with managed luminance (on their own screens), which again is the better way of doing it, rather than having an actual "on/off" toggle like on non-Apple displays.

 