Maybe you have the 15” model and it’s different from the 13”.
And here’s me telling you it isn’t possible. Either it’s sending 8-bit color or it’s sending 4:2:2; HDMI can’t send 10-bit RGB at that resolution:
http://community.cedia.net/blogs/david-meyer/2017/06/22/4k-60-444-hdr
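For reference, here is the TMDS bandwidth arithmetic this disagreement turns on. A rough sketch, assuming standard CTA-861 4K60 timing (4400 × 2250 total pixels, 594 MHz pixel clock) and TMDS deep color clock scaling; it says nothing about which mode a particular adapter or monitor actually negotiates.

```swift
import Foundation

// Back-of-the-envelope TMDS bandwidth check for 3840x2160@60 over HDMI 2.0,
// whose total TMDS bandwidth is 18 Gbit/s (600 MHz TMDS character rate).
// Assumes standard CTA-861 4K60 timing: 4400 x 2250 total pixels at 60 Hz.

let pixelClockHz = 4400.0 * 2250.0 * 60.0   // = 594 MHz
let hdmi20LimitGbps = 18.0

/// Total TMDS bandwidth (Gbit/s) for RGB or YCbCr 4:4:4 at a given bit depth.
/// Deep color scales the TMDS clock by bpc/8; three channels, 10 bits each.
func tmdsBandwidthGbps(bitsPerComponent: Double) -> Double {
    let tmdsClockHz = pixelClockHz * bitsPerComponent / 8.0
    return tmdsClockHz * 10.0 * 3.0 / 1e9
}

for (label, bpc) in [("8-bit RGB/4:4:4", 8.0), ("10-bit RGB/4:4:4", 10.0)] {
    let gbps = tmdsBandwidthGbps(bitsPerComponent: bpc)
    let verdict = gbps <= hdmi20LimitGbps ? "fits within" : "exceeds"
    print("\(label): \(String(format: "%.2f", gbps)) Gbit/s -> \(verdict) HDMI 2.0")
}

// 10/12-bit YCbCr 4:2:2 travels in the same 24-bit-per-pixel container as
// 8-bit RGB, so it stays at ~17.82 Gbit/s and fits within HDMI 2.0.
```

It prints roughly 17.82 Gbit/s for 8-bit 4:4:4 and 22.28 Gbit/s for 10-bit 4:4:4, which is exactly the boundary the two posts above are arguing across.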
FFS. I posted the fricking screenshots of my HDMI connection to a 4K monitor at 10-bit, 60 Hz.
macOS won't report 10-bit if it is actually running at 8-bit. We know that because the MBP's internal screen reports 8-bit while the external 10-bit screen reports 10-bit.
We've been outputting ARGB101010 4K60 from MacBook Pros for nearly two years. The most important point is to get the correct adapters and cables.
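If you want to check what your own machine reports without digging for screenshots, here is a minimal AppKit sketch. It prints the bits-per-sample AppKit sees for each attached display; whether that matches the framebuffer-depth readout shown in the screenshots is an assumption on my part, since macOS exposes this value in more than one place.

```swift
import AppKit

// Print the bits-per-sample AppKit reports for each attached display.
// NSScreen.depth is the screen's current window depth as AppKit sees it;
// NSBitsPerSampleFromDepth converts that depth to bits per color component.

for (index, screen) in NSScreen.screens.enumerated() {
    let bitsPerSample = NSBitsPerSampleFromDepth(screen.depth)
    print("Display \(index): \(bitsPerSample) bits per sample, frame \(screen.frame)")
}
```

Run it with `swift depthcheck.swift` (the file name is arbitrary) while the external display is connected.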
You don't even understand UI scaling in macOS. You are a complete waste of our time. You ACTUALLY thought I was running a 5K screen at 1440p because you don't understand the UI scaling options. That is so insane I can't stop laughing.
You also state that we have to wait for HDMI 2.1 to get 4K60 in 10-bit. That's wrong too. We already have that with HDMI 2.0b.
https://www.digitaltrends.com/home-theater/what-is-hdmi-2-0/
‘HDMI 1.4 supported 4K resolutions, yes, but only at 24 or 30 frames per second. That works fine for movies, but isn’t useful for gaming and many TV broadcasts, which require 50 or 60 fps. Also, HDMI 1.4 limited 4K Ultra HD content to 8-bit color, though it is capable of 10- or 12-bit color. HDMI 2.0 fixed all of that because it could handle up to 18 gigabits per second — plenty enough to allow for 12-bit color and video up to 60 frames per second.’
Enough of your nonsense, please.