
onlinespending
Original poster
A lot of discussion exists on getting 4K @ 60 or 120Hz with 10-bit HDR and full 4:4:4 chroma (YCbCr 4:4:4, i.e. no subsampling) on an external monitor, but seemingly mostly in the M1 forums. And even there, there's little discussion of how to actually get it working beyond the theoretical hardware and cable support.

I have both a 16" MacBook Pro and an M1 Mac Mini, and I'd be content getting either one of them to work properly. My current monitor is the LG 48CX OLED TV, which has HDMI 2.1 inputs and supports 4K@120Hz with 10-bit HDR. I use a Cable Matters 48Gbps Thunderbolt 3 to HDMI 2.1 adapter. This same setup works flawlessly with my work PC laptop, which has Thunderbolt 3 outputs: just connect everything and it works. When did Windows become the platform where stuff just works?

My 16" MacBook Pro is a different story. By default I get 4K@60Hz with YCbCr 4:2:0; the diagnostics screen on the LG TV reports it as "YCbCr 4:2:0 10-bit TM". On the Mac, there's no way to change anything in Display Settings except the scaling: not the resolution, the refresh rate, the color depth, or HDR. The color profile used is "LG TV SSCR".

Is there some sort of plist hack, EDID override, or anything else I need to do just to get full YCbCr 4:4:4? Before I even consider 120Hz, getting the color correct is the first order of business. Do I need to mess with SwitchResX to get YCbCr 4:4:4 and HDR? I do have Deep Color enabled for that input on my TV. I'm at a loss here and confounded by why Apple makes this so difficult; the Apple TV seemingly has better display support. Thank you for any suggestions!

UPDATE: Based on further reading, I changed the HDMI input's label to PC in the TV's settings, but it didn't offer any benefit.

[Attachment: Display Settings.jpg]
 
Hi! I have a 48CX + Mac Mini M1.
  • With a direct HDMI-to-HDMI connection, the TV's FreeSync information screen reports the output as YCbCr 4:4:4 8-bit TM.
  • With the Apple A2119 USB-C to HDMI adapter, it reports RGB 8-bit TM.
With the Apple adapter I've managed to get RGB output, great! But it's still 8-bit ("TM"), not 10-bit.

I posted some tests here: https://forums.macrumors.com/thread...r-mode-with-mac-mini-m1.2272978/post-29557494
 
[Quoting the original post above.]
4K 120Hz possibilities (assuming a 1188 MHz pixel clock; a rough bandwidth check is sketched after this list):
1) HDMI 2.0: 420 8bpc
2) HBR3: 422 10bpc
3) HDMI 2.1: 444 10bpc
4) HBR3 with DSC: 444 12bpc
5) HDMI 2.1 with DSC: 444 16bpc
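
To put rough numbers on those (back-of-the-envelope only: uncompressed bit rate = pixel clock × bits per pixel, with approximate link payloads of ~14.4 Gbit/s for HDMI 2.0 TMDS, ~25.9 Gbit/s for 4-lane HBR3, and ~42.7 Gbit/s for 48G HDMI 2.1 FRL), something like this awk one-liner reproduces them:

awk 'BEGIN {
  pclk = 1188e6                                    # assumed 4K120 HDMI pixel clock, Hz
  printf "444 8bpc : %4.1f Gbit/s\n", pclk*24/1e9  # 28.5 -> needs HDMI 2.1 or DSC
  printf "444 10bpc: %4.1f Gbit/s\n", pclk*30/1e9  # 35.6 -> fits HDMI 2.1 FRL
  printf "422 10bpc: %4.1f Gbit/s\n", pclk*20/1e9  # 23.8 -> fits 4-lane HBR3
  printf "420 8bpc : %4.1f Gbit/s\n", pclk*12/1e9  # 14.3 -> just fits HDMI 2.0
}'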

I would forget the M1 Mac. EDID overrides don't work on it, except to change the display name.

If DSC is not working (may be a problem with Big Sur - did you try Catalina?), you should still be able to get 422 10bpc or 420 10bpc. You may need to edit the EDID to get that. SwitchResX does not have features to override the color info, so you'll have to do it using a different utility.
I've got a script at https://gist.github.com/joevt/32e5efffe3459958759fb702579b9529 to help examine and modify EDIDs.

First, get the connection and EDID info for each port of the display (and also for any settings that might modify the EDID) using this command (do it in both Catalina and Big Sur; change the file name to describe which port and/or TV setting it is for):
/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose -a > AGDCDiagnose_a_port1.txt 2>&1
The files will show whether DSC is enabled or not. The files can be loaded by the EDIDUtil.sh script using this command: loadagdcfile AGDCDiagnose*.txt. The listedids command will then show all the EDIDs (duplicates are grouped together).
You can zip the AGDCDiagnose files and post them here so we can see what's going on.
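
In practice a session might look something like this (the file name is just an example describing the port and TV setting; I'm assuming EDIDUtil.sh is sourced so its functions are available, check the gist for its intended usage):

# one dump per TV port / setting, on each macOS version you want to compare
/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose -a > AGDCDiagnose_a_hdmi2_pcmode_bigsur.txt 2>&1
# load the helper functions and inspect the dumps
source ./EDIDUtil.sh
loadagdcfile AGDCDiagnose*.txt   # loads every dump found
listedids                        # lists the EDIDs, duplicates grouped together
# zip the dumps to attach them to the thread
zip AGDCDiagnose_dumps.zip AGDCDiagnose_a_*.txt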

Even if you get the right color, I don't know if HDR will work. I've only seen HDR get enabled with Apple adapters. Maybe there's an EDID override that can get it to work. We'd need to compare what the Apple adapter reports with what the club-3d adapter reports.
 
[Quoting the previous reply in full.]

Thanks for the details on using that script. I'll give it a shot. As mentioned, I'm already getting YCbCr 4:2:0 10-bit, which is crap. I'd even take 4:4:4 8-bit, but given that others are claiming they can get 4K@120Hz with the 16" MacBook Pro, I'm not sure I'd be totally satisfied with that.
 
I have the exact same setup: MacBook Pro 16", LG OLED 48CX, and that brand-new 8K@30Hz / 4K@120Hz Cable Matters 48Gbps Thunderbolt 3 to HDMI 2.1 adapter. When I ordered the adapter, there was no information indicating that it would max out at 4K@60Hz on a Mac. I would also like to use the hardware to its full potential, and I can't believe there are such problems connecting Macs to external monitors/TVs. I'm willing to help find a solution.
 
Hey, so I just got a 65" CX OLED that I've been using with my MacBook, and googling the problem I've been experiencing led me here.

I have a 16" MacBook Pro with the 5500M 8GB. I have the CX's input in PC mode, and I'm using an HDMI 2.0 cable to connect the MacBook to the TV via the Apple multiport AV adapter (the one that goes from USB-C to HDMI, USB-A, and USB-C).

Text looks like garbage, and I'm pretty surprised and disappointed. Black text on a white background looks fine and plenty crisp, but brightly colored text on a white background (like the green or blue you'd have in a coding environment) looks absolutely terrible. There's a lighter-colored halo around the text that makes it look really blurry and hard to look at.

I've been led to believe that the problem is that the MacBook is sending a 4:2:0 signal to the TV, and that this would be fixed if it were sending 4:4:4. Is this right? I'm very new to this kind of stuff, so I'm wondering why I haven't experienced this issue with other large TVs I've used as monitors.
 
I've been led to believe that the problem is that the MacBook is sending a 4:2:0 signal to the TV, and that this would be fixed if it were sending 4:4:4. Is this right? I'm very new to this kind of stuff, so I'm wondering why I haven't experienced this issue with other large TVs I've used as monitors.
Probably. The AGDCDiagnose command will tell you what signal type it is sending.
Since you have an Intel Mac, you can override the EDID to remove 4:2:2 and 4:2:0, which forces 4:4:4.
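For reference, a minimal sketch of how an override like that usually goes on an Intel Mac (the exact byte patching is what a tool like the EDIDUtil.sh script linked earlier handles; the override path below is just the commonly used location, so treat it as an assumption):

# dump the current EDID (Intel Macs expose it in the IORegistry)
ioreg -lw0 | grep -i IODisplayEDID
# after patching the EDID bytes to drop the 4:2:2 / 4:2:0 capability bits, the
# patched EDID goes into a display override plist, commonly under:
#   /Library/Displays/Contents/Resources/Overrides/DisplayVendorID-<vid>/DisplayProductID-<pid>
# then reboot so the override is picked up.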
 
[Quoting the original post above.]
This man has posted the #1 reason I am waiting for the M1X MacBook Pro with an HDMI 2.1 output. No Mac can deliver 4K@60Hz (or 120Hz) in 10-bit 4:4:4 color unless it's driving a Pro Display XDR, which combines two DisplayPort streams to get around the bandwidth limitations. It's just not happening, period.
 
I was able to remove chroma subsampling on my Intel MacBook. I did it with SwitchResX and the Apple A2119 adapter. 4K 60Hz, though not 120Hz.
 
I was able to remove chroma subsampling on my Intel MacBook. I did it with SwitchResX and the Apple A2119 adapter. 4K 60Hz, though not 120Hz.
Apple's adapter supports input of two lanes of HBR3 and DSC. It outputs HDMI 2.0b. An HDMI 2.1 cable is not going to get you better output (but it is probably a higher-quality cable, so you can buy it for future-proofing).

4K 60Hz CVT-RB is under 540 MHz, so it can be transmitted by two lanes of HBR3 without DSC using 8 bpc RGB/444. Any DisplayPort 1.4 capable GPU can do HBR3 (as long as it's not coming from a Thunderbolt Alpine Ridge device).

4K 60Hz HDMI is 594 MHz so it cannot be transmitted by Apple's adapter using 8 bpc RGB/444 unless DSC is enabled, which it might not be in Big Sur or if your GPU is not AMD Navi or Ice Lake or M1.

HDR requires 10 bpc, so there's no way you're getting that with 4K 60Hz 444 from HDMI 2.0 (10 bpc 444 limits the pixel clock to 480 MHz on HDMI 2.0).
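
Putting rough numbers on that (same back-of-the-envelope assumptions as earlier in the thread): two lanes of HBR3 carry roughly 2 × 8.1 Gbit/s × 0.8 ≈ 13.0 Gbit/s of payload after 8b/10b coding, and 4K 60Hz CVT-RB at 8 bpc RGB needs about 533 MHz × 24 bits ≈ 12.8 Gbit/s, which just fits; the 594 MHz HDMI timing at 24 bpp (≈ 14.3 Gbit/s) doesn't, hence the need for DSC. On the HDMI 2.0 side, deep color scales the TMDS clock, so the 600 MHz limit becomes 600 × 8/10 = 480 MHz of pixel clock at 10 bpc.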

If you have an Intel Mac, you can check the pixel output format (bpc and chroma subsampling) using the AGDCDiagnose command.
 
4K is generally hard. I don't know why, but I wanted triple-4K support, and I looked at a lot of video cards trying to figure out what would work well; the GPU makers' descriptions are vague about what each card can actually do. Cards may theoretically be able to do what you want, but not well, or even not at all. An iMac i7 with one of the high-end AMD cards appears to do what I want, but getting one of those GPUs is difficult right now.

I hope that graphics improve with the M1X - a lot.
 
I wonder if HDR is possible in my case, with an LG 27UN83A and a MacBook Pro 16" 2019 on macOS.

After connecting for the first time over USB-C, I noticed the HDR icon on the LG and then the famous "washed-out colors" / poor-contrast image. Display preferences also shows an HDR checkbox; after unchecking HDR the colors are fine, but of course there's no HDR, so color stripes can easily be observed on large-area gradients.

I browsed the net for about half a day reading about various options. I played with a gist script that's supposed to fix the issue, https://gist.github.com/adaugherity/7435890, but I've only been able to disable HDR with it. After reading the LG manual, I found one confusing piece of info:

To use the HDR function, set HDMI ULTRA HD Deep Color to "On".
and there is a Color Depth / Chroma Sampling table suggesting that:

YCbCr 4:2:0 and YCbCr 4:2:2 are supported at both 8-bit and 10-bit

YCbCr 4:4:4 and RGB 4:4:4 are supported only at 8-bit (the 10-bit column in that row is empty)

Might that imply the best HDR this monitor can do is YCbCr 4:2:2? If so, I wonder if there's a way to make it work without washed-out colors on macOS. (I just realized I'm on Big Sur; I'll try upgrading to Monterey...)
 
[Quoting the previous post about HDR on the LG 27UN83A.]
A lower refresh rate (30Hz HDMI or 50Hz CVT-RB or 55Hz CVT-RB2) could allow 4:4:4 10bpc/HDR. I would test with Windows since it gives more control.
 
A lower refresh rate (30Hz HDMI or 50Hz CVT-RB or 55Hz CVT-RB2) could allow 4:4:4 10bpc/HDR. I would test with Windows since it gives more control.
As I'm new to the topic, does YCbCr 4:2:2 imply washed-out colors? Or is there a way to fix the washed-out colors, even if it might require extra OS support?

From the reviews I was reading, 4:4:4 is just not supported by this monitor.
 
As I'm new to the topic, does YCbCr 4:2:2 imply washed-out colors? Or is there a way to fix the washed-out colors, even if it might require extra OS support?

From the reviews I was reading, 4:4:4 is just not supported by this monitor.
4:2:2 or 4:2:0 does not imply washed-out colors; it implies blurry coloured text.
https://www.rtings.com/tv/learn/chroma-subsampling

HDR is what's causing the washed-out colors. My guess is that macOS is not adjusting the colors used by non-HDR apps when HDR is enabled. The washed-out colors go away when you watch an HDR YouTube video.
 
Thanks, that makes sense. But I'm even more confused now :) I switched on HDR and tried various HDR samples on YouTube, plus MP4 files in QuickTime Player and MKVs, and they're all dimmed (I can easily compare the maximum-white moments against the OSD menu).

As for HDR: I bought the monitor mainly for coding, so the key thing I'd expect from HDR on macOS is to avoid seeing color bands/stripes, and I'm surprised to see really great quality on some videos with no bands at all (e.g. [embedded video]) while there are noticeable bands on others (e.g. the 54th second of the Matrix trailer [embedded video]). Since after calibration the real-world image quality, as well as all the calibration test patterns, looks really great, I'll probably stick with this monitor given its very good value (and it provides both signal and power over USB-C). I may also need to test it with a PS5...
 