Member from Paris (joined Jul 4, 2015):
Alameda said:
> Maybe you have the 15” model and it’s different from the 13”.
> And here’s me telling you it isn’t possible. Either it’s sending 8-bit color or 4:2:2, because HDMI can’t send 10-bit RGB at that resolution:
> http://community.cedia.net/blogs/david-meyer/2017/06/22/4k-60-444-hdr

FFS. I posted the fricking screenshots of my HDMI connection with a 4K monitor at 10-bit, 60 Hz.

macOS won't report 10-bit if it is running at 8-bit. We know that because the MBP's internal screen reports 8-bit while the external 10-bit screen reports 10-bit.

We've been outputting ARGB2101010 4K60 from MacBook Pros for nearly two years. The most important point is to get the correct adapters and cables.

You don't even understand UI scaling in macOS. You are a complete waste of our time. You ACTUALLY thought I was running a 5K screen at 1440p because you don't understand the UI scaling options. That is so insane I can't stop laughing.

You also state that we have to wait for HDMI 2.1 to get 4K60 at 10-bit. That's wrong too. We already have that with HDMI 2.0b.

https://www.digitaltrends.com/home-theater/what-is-hdmi-2-0/

‘HDMI 1.4 supported 4K resolutions, yes, but only at 24 or 30 frames per second. That works fine for movies, but isn’t useful for gaming and many TV broadcasts, which require 50 or 60 fps. Also, HDMI 1.4 limited 4K Ultra HD content to 8-bit color, though it is capable of 10- or 12-bit color. HDMI 2.0 fixed all of that because it could handle up to 18 gigabits per second — plenty enough to allow for 12-bit color and video up to 60 frames per second.’


Enough of your nonsense, please.
 

Alameda (macrumors 65816, original poster):
Member from Paris said:
> FFS. I posted the fricking screenshots of my HDMI connection with a 4K monitor at 10-bit, 60 Hz. […]

I already showed you the math.
HDMI 2.0 can only deliver 4K60 at 10/12-bit if you sub-sample to 4:2:2, because 8/10/12-bit 4:2:2 has the same bandwidth as 8-bit RGB. 4:2:2 is fine for video, because MPEG-2 for DVD/broadcast and H.264/H.265 for Blu-ray sub-sample to 4:2:0 before any encoding begins. But computer graphics are RGB, and HDMI 2.0 can't deliver 4K60 with 10-bit RGB, not even if you reduce all the blanking periods to zero (which isn't possible). Look at the math. It's very clear.
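Spelled out, that math is a short back-of-the-envelope check (a sketch, assuming the standard CTA-861 timing for 4K60 of 4400 × 2250 total pixels including blanking, and HDMI's 8b/10b TMDS line coding; the function name is mine for illustration):

```python
# Can HDMI 2.0 (18 Gbit/s aggregate TMDS rate) carry 4K60 10-bit RGB?

GBIT = 1e9

def tmds_rate(h_total, v_total, refresh_hz, bits_per_channel):
    """On-the-wire TMDS bit rate for RGB: 3 channels, 8b/10b line coding."""
    pixel_clock = h_total * v_total * refresh_hz       # pixels per second
    payload = pixel_clock * 3 * bits_per_channel       # raw pixel bits per second
    return payload * 10 / 8                            # 8b/10b coding overhead

# Standard CTA-861 timing for 3840x2160 @ 60 Hz: 4400 x 2250 total pixels
print(tmds_rate(4400, 2250, 60, 8) / GBIT)    # ~17.82 Gbit/s -> fits under 18
print(tmds_rate(4400, 2250, 60, 10) / GBIT)   # ~22.28 Gbit/s -> does not fit

# Even with (impossible) zero blanking, 10-bit RGB still doesn't fit:
print(tmds_rate(3840, 2160, 60, 10) / GBIT)   # ~18.66 Gbit/s -> still over 18
```

That 22.28 Gbit/s figure is where the "22 Gbit/s" below comes from. And because HDMI carries 4:2:2 in fixed 12-bit containers, 8/10/12-bit 4:2:2 costs the same 24 bits per pixel as 8-bit RGB.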
 

Alameda (macrumors 65816, original poster):
Member from Paris said:
> FFS. I posted the fricking screenshots of my HDMI connection with a 4K monitor at 10-bit, 60 Hz.
And whatever that screenshot is telling you is wrong.

Educate yourself and read about what HDMI can do. HDMI 2.0 cannot carry 22 Gbit/s. Your display might be a 10-bit display, but it isn't receiving 10-bit from its HDMI input, because it can't.

*** I showed you the math ***
If that’s not good enough for you, buy the spec and read it yourself.
 
Member from Paris (joined Jul 4, 2015):
Alameda said:
> And whatever that screenshot is telling you is wrong. […]

The latest HDMI 2.0 spec includes the 4K60 10-bit support that HDR requires.

https://www.trustedreviews.com/opinion/hdmi-2-0-vs-1-4-2913356

From the Wikipedia entry:

‘HDMI 2.0a was released on April 8, 2015, and added support for High Dynamic Range (HDR) video with static metadata.

HDMI 2.0b was released March, 2016.[117] HDMI 2.0b initially supported the same HDR10 standard as HDMI 2.0a as specified in the CTA-861.3 specification.[115] In December 2016 additional support for HDR Video transport was added to HDMI 2.0b.’

macOS won't show 10-bit color output in System Profiler if it isn't outputting 10-bit. We have confirmed that on several forums, hundreds of times.
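Anyone can pull that report from the command line and check; here's a minimal sketch (assuming macOS, whose real `system_profiler SPDisplaysDataType` output includes a per-display depth line; the exact wording of that line varies by macOS version):

```python
# Print each display's reported pixel depth on macOS.
import subprocess

def display_depth_lines():
    """Run system_profiler and return the per-display depth lines."""
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Depending on macOS version the line reads e.g.
    # 'Framebuffer Depth: 30-Bit Colour (ARGB2101010)' or
    # 'Pixel Depth: 32-Bit Color (ARGB8888)'.
    return [line.strip() for line in out.splitlines() if "Depth" in line]

if __name__ == "__main__":
    for line in display_depth_lines():
        print(line)
```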

We on the Mac Pro forums have been doing it for two years, and on this MacBook Pro forum for nearly two years. We have tested more monitor and GPU configurations than anyone, with test images and HDR videos.

We verify, report and confirm.

You're the guy who didn't know about UI scaling and won't accept any evidence. Use the correct adapters: HDMI 2.0a or 2.0b, active, high quality, no cheap knock-offs. It works on any TB3 port.

The claims you make in your first post are false.
 

Alameda (macrumors 65816, original poster):
Member from Paris said:
> https://www.trustedreviews.com/opinion/hdmi-2-0-vs-1-4-2913356
> From the Wikipedia entry: […]
You don’t understand.
HDMI 1.4 could transmit 4K30 8-bit RGB.
That means HDMI 1.4 could also transmit 4K30 8/10/12-bit 4:2:2 YCbCr.
HDMI 2.0 doubles this: 4K60 8-bit RGB, or 4K60 8/10/12-bit 4:2:2 YCbCr.
Since Blu-ray and broadcast TV content are all 4:2:0, HDMI 2.0 can support 10-bit HDR video.
HDMI 2.0a just adds the permission to send an InfoFrame (a packet, really) which indicates that the video is encoded in PQ, a.k.a. HDR10 mode. But some European broadcasters wanted to use an HDR model called Hybrid Log-Gamma, so HDMI 2.0b added permission to send an HLG InfoFrame.
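To make those capacity steps concrete, here's a sketch under the same assumptions as the earlier check (CTA-861 totals of 4400 × 2250 for 4K, aggregate TMDS ceilings of 10.2 Gbit/s for HDMI 1.4 and 18 Gbit/s for HDMI 2.0, and 4:2:2 carried in fixed 12-bit containers, i.e. 24 bits per pixel at any stated depth):

```python
# Which 4K formats fit under HDMI 1.4 vs HDMI 2.0?

HDMI_1_4 = 10.2e9   # max aggregate TMDS rate, bits/s (340 MHz x 10 x 3 lanes)
HDMI_2_0 = 18.0e9   # max aggregate TMDS rate, bits/s (600 MHz x 10 x 3 lanes)
H_TOTAL, V_TOTAL = 4400, 2250   # CTA-861 total pixels for 3840x2160

def wire_rate(refresh_hz, bits_per_pixel):
    """TMDS bit rate: pixel clock x bits per pixel x 10/8 (8b/10b coding)."""
    return H_TOTAL * V_TOTAL * refresh_hz * bits_per_pixel * 10 / 8

formats = {
    "4K30 8-bit RGB":          wire_rate(30, 24),
    "4K30 8/10/12-bit 4:2:2":  wire_rate(30, 24),  # 12-bit containers: 24 bpp
    "4K60 8-bit RGB":          wire_rate(60, 24),
    "4K60 8/10/12-bit 4:2:2":  wire_rate(60, 24),
    "4K60 10-bit RGB":         wire_rate(60, 30),
}
for name, rate in formats.items():
    if rate <= HDMI_1_4:
        verdict = "fits HDMI 1.4"
    elif rate <= HDMI_2_0:
        verdict = "fits HDMI 2.0"
    else:
        verdict = "fits neither (needs HDMI 2.1)"
    print(f"{name}: {rate / 1e9:5.2f} Gbit/s -> {verdict}")
```

The 4K30 rows land at about 8.91 Gbit/s (under HDMI 1.4's ceiling), the 4K60 24-bpp rows at about 17.82 Gbit/s (under HDMI 2.0's), and 4K60 10-bit RGB at about 22.28 Gbit/s, over both.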

If Apple's System Profiler indicates that the HDMI interface is sending 4K60 10-bit RGB, it is an error. The link can't transmit that much data over the HDMI cable, nor can the display receive it.
 