A lot of discussion exists about driving an external monitor at 4K @ 60 or 120Hz with 10-bit HDR and full 4:4:4 chroma (YCbCr444, i.e., no subsampling), but it seems to live mostly on the M1 forums. Even there, there's little discussion of how to actually get it working beyond the theoretical hardware and cable support.
I have both a 16" MacBook Pro and an M1 Mac Mini, and I'd be content getting either one to work properly. My monitor is an LG 48CX OLED TV, which has HDMI 2.1 inputs and supports 4K@120Hz with 10-bit HDR. I connect through a Cable Matters 48Gbps Thunderbolt 3 to HDMI 2.1 adapter. This exact setup works flawlessly with my work PC laptop over its Thunderbolt 3 ports: just connect everything and it works. When did Windows become the platform where stuff just works?
My 16" MacBook Pro is a different story. By default I get 4K@60Hz in YCbCr 4:2:0; the LG TV's diagnostics report the signal as "YCbCr 4:2:0 10-bit TM". On the Mac side, Display Settings lets me change nothing except scaling: not the resolution, refresh rate, color depth, or HDR. The color profile in use is "LG TV SSCR".
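For reference, here's how I've been checking what the Mac itself claims to be outputting (as opposed to what the TV reports). This is a minimal Python sketch that just shells out to macOS's built-in system_profiler; the exact field names it prints (e.g. "Pixel Depth", "Television") vary by macOS version, so treat the filter list as a starting point:

```python
#!/usr/bin/env python3
# Dump what macOS reports for the connected displays.
# Uses the stock `system_profiler` tool; no third-party dependencies.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

# Print only the lines that hint at the active signal format.
for line in report.splitlines():
    if any(key in line for key in ("Resolution", "UI Looks like",
                                   "Pixel Depth", "Television", "Refresh")):
        print(line.strip())
```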
Is there some plist hack, EDID override, or similar that I need just to get full YCbCr 4:4:4? Before I even consider 120Hz, getting the color correct is the first order of business. Do I need to mess with SwitchResX to get YCbCr 4:4:4 and HDR? I do have deep color enabled for that input on the TV. I'm at a loss here and confounded by why Apple makes this so difficult; even the Apple TV seems to have better display support. Thanks for any suggestions!
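In case anyone suggests the EDID-override route: below is my understanding of the classic "force RGB" hack (the patch-edid.rb approach) as a Python sketch, not something I've verified on Big Sur or Apple Silicon. Note that it forces RGB rather than YCbCr 4:4:4, which avoids chroma subsampling either way. It assumes you've already dumped the raw EDID to a file named edid.bin (e.g. from ioreg output or SwitchResX's export), the display label is a placeholder of mine, and on recent macOS SIP may block installing the resulting plist at all:

```python
#!/usr/bin/env python3
# Sketch of the classic "force RGB" EDID-override hack for Intel Macs.
# Assumes edid.bin holds the display's raw EDID; install path may be
# protected by SIP, and Apple Silicon may ignore overrides entirely.
import plistlib

edid = bytearray(open("edid.bin", "rb").read()[:128])  # EDID base block only

# Byte 24, bits 4:3 advertise YCbCr 4:4:4 / 4:2:2 support; clearing them
# leaves RGB-only, so macOS can't fall back to a YCbCr mode for this display.
edid[24] &= ~0b00011000

# Byte 127 is the checksum: all 128 bytes must sum to 0 mod 256.
edid[127] = (-sum(edid[:127])) % 256

# Vendor/product IDs as used in the override directory name
# (e.g. LG's PNP ID "GSM" comes out as 0x1e6d).
vendor = (edid[8] << 8) | edid[9]     # big-endian PNP vendor ID
product = edid[10] | (edid[11] << 8)  # little-endian product code

override = {
    "DisplayProductName": "LG TV (RGB forced)",  # hypothetical label
    "DisplayVendorID": vendor,
    "DisplayProductID": product,
    "IODisplayEDID": bytes(edid),
}
with open(f"DisplayProductID-{product:x}", "wb") as f:
    plistlib.dump(override, f)

print("Install under /Library/Displays/Contents/Resources/Overrides/"
      f"DisplayVendorID-{vendor:x}/DisplayProductID-{product:x}")
```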
UPDATE: Based on further reading, I relabeled the HDMI input as "PC" in the TV's settings, but it made no difference.