I'm also confused, but I just read quite a few articles about this, so I'll try to explain what I learned.
- 10-bit colour (or 8-bit + 2-bit FRC) is a requirement for the HDR10 standard, along with other specs (
https://en.wikipedia.org/wiki/High-dynamic-range_video#HDR10). So a monitor can have 10 bits of colour depth without being HDR10 compliant, but an HDR10 monitor must have at least 10 bits of colour depth.
This is a matter of decoding.
I don't think this applies to an iMac.
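Just to put rough numbers on that 8-bit vs. 10-bit point, here's a quick back-of-the-envelope sketch (Swift, purely illustrative):

```swift
// Each extra bit per channel doubles the number of gradations.
let levels8  = 1 << 8     // 256 levels per channel
let levels10 = 1 << 10    // 1,024 levels per channel

// Total colours = levels^3 (one value each for R, G and B).
let colours8  = levels8 * levels8 * levels8      // 16,777,216   (~16.7 million)
let colours10 = levels10 * levels10 * levels10   // 1,073,741,824 (~1.07 billion)

print("8-bit: \(colours8) colours, 10-bit: \(colours10) colours")
```

Those extra two bits per channel are what let HDR gradients avoid visible banding.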
- Being HDR10 compliant allows the monitor to display a higher contrast between the darkest and brightest points on screen simultaneously.
There is a difference between being truly HDR10 compliant and merely being able to decode HDR10 content.
I don't think the new LG monitors can achieve the nits necessary for full HDR10 compliance.
There are lots of displays out there that advertise HDR10 compatibility, but which do not actually meet the full spec requirements for HDR10 compliance. But that's OK, because they still often look quite decent.
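As an aside, on newer macOS releases AppKit can tell you how much extended-dynamic-range headroom a panel actually claims; here's a minimal sketch of that check (the "potential" property needs macOS 10.15 or later, so this is more of a forward-looking note than something I've run on these machines):

```swift
import AppKit

// Ask the main display how much EDR headroom it reports. A value of 1.0 means
// plain SDR; HDR-capable panels report more when HDR content is on screen.
if let screen = NSScreen.main {
    let current   = screen.maximumExtendedDynamicRangeColorComponentValue
    let potential = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    print("Current EDR headroom: \(current), potential: \(potential)")
}
```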
- It can be used by different media, some of which are protected by HDCP 2.2, which the iMac is now compliant with since the 2017 version, thanks to the Kaby Lake iGPU.
Yes. Aside from the MacBook Air, any 2017 Mac (even the MacBook) supports this already.
The iMac Pro has no iGPU, but it appears Apple has implemented it on the AMD Vega GPU in that model.
Interestingly, Apple waited many months after Kaby Lake Y came out before releasing a new MacBook. At first I didn't know why, but then I realized it was because Apple was waiting for two things, both implemented in a newer series of Kaby Lake Y chips:
1) A faster m3
2) HDCP 2.2 compliance across the line
The first batch of Kaby Lake Y m3 chips was slow, and that first batch didn't support HDCP 2.2 either; that support didn't come until spring 2017, even though Kaby Lake Y launched in 2016.
This is one reason I believe Apple will likely eventually offer 4K streaming on 2017 Macs.
The good news is that in QuickTime, my lowly 2017 12" MacBook Core m3 can play back some complex 4K videos at under 25% CPU usage that a top-of-the-line 2015 iMac Core i7-6700K can't handle even at 100% usage, so it's clear Apple has already begun implementing this standard.
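If anyone wants to check whether their Mac exposes the hardware decoder at all, here's a minimal sketch, assuming macOS 10.13 or later where VideoToolbox offers this query:

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether the OS exposes a hardware HEVC decoder on this Mac.
// Kaby Lake (and newer) machines should report true; older iGPUs generally
// fall back to software decode, especially for 10-bit HEVC.
let hevcInHardware = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode available: \(hevcInHardware)")
```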
- Yet, currently even non-protected HDR content such as illegally downloaded movies can't be displayed correctly on a 2017 iMac. Users report that the colours look dull and weird. I'm not sure if this is just a software limitation on macOS or if some specs of HDR10 are not met by the current iMac. Maybe someone with a better understanding of the HDR10 specs can shine some light on this. Can you play non-protected HDR content on a current iMac running Windows?
Yes, according to reports, you can play non-protected HDR10 content with proper colours AND you can play protected 4K content on a current iMac running Windows.
I was hoping to see this support announced for 10.14 at WWDC. We didn't get that, but I'm still keeping my fingers crossed that it arrives with Mojave in the fall or in the new year.
- Part of the HDR feature of the new LG monitors mentioned above is a feature called "HDR Effect" which modifies SDR content to make it look more like HDR content in real time. I haven't tried this but it sounds a little gimmicky. I doubt this will come to the iMac. (
https://www.lg.com/ca_en/desktop-monitors/lg-27UK850-W-4k-uhd-led-monitor)
More important, IMO, is the ability (for the MacBook) to convert HDR to SDR. I have this setup in my home theatre. My projector is 1080p SDR, but I have a 4K UHD Blu-ray player to gain access to Dolby Atmos, since some studios refuse to release Dolby Atmos tracks on regular 1080p Blu-ray discs, even though the format already supports them. I initially had a Philips player that was decent but did no HDR-to-SDR conversion, and yes, it looked awful. I then bought a Panasonic, which seems to share the same base engine (the Philips appears to be a rebranded Panasonic) but adds processing options to convert HDR to SDR on the fly, with good results. (A rough sketch of what that conversion involves follows below.)
BTW, ironically, while UHD Blu-ray works fine sent to a 1080p projector, Netflix does not. 4K Netflix isn't an option, but even 1080p Netflix doesn't work because HDCP 2.2 is missing from my chain, even though HDCP 2.2 isn't required for 1080p. So I went out and bought a 4K HDMI splitter which also strips the HDCP 2.2 requirement, and now I get my 1080p Netflix out of that player too.
1080p Netflix from a 1080p Blu-ray player works fine without HDCP 2.2. Why Netflix would program their app on 4K players to not work at all at any resolution (even 1080p) without HDCP 2.2 is beyond me. Lazy programming?
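For anyone curious what the HDR-to-SDR conversion I mentioned above actually involves, here's a very rough sketch of the idea (not what the Panasonic actually implements): decode the PQ (SMPTE ST 2084) signal back to linear light, compress the highlights, then re-encode with an SDR gamma.

```swift
import Foundation

// SMPTE ST 2084 (PQ) EOTF: map a 0-1 code value to absolute luminance in nits.
func pqToNits(_ signal: Double) -> Double {
    let m1 = 2610.0 / 16384.0
    let m2 = 2523.0 / 4096.0 * 128.0
    let c1 = 3424.0 / 4096.0
    let c2 = 2413.0 / 4096.0 * 32.0
    let c3 = 2392.0 / 4096.0 * 32.0
    let p = pow(signal, 1.0 / m2)
    return 10000.0 * pow(max(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1)
}

// Very crude Reinhard-style tone map: squeeze everything above SDR reference
// white (~100 nits) back into range, then re-encode with a ~2.4 gamma.
func toneMapToSDR(_ nits: Double, sdrWhite: Double = 100.0) -> Double {
    let n = nits / sdrWhite
    return pow(min(n / (1.0 + n), 1.0), 1.0 / 2.4)
}

let pqCode = 0.508                      // roughly 100 nits in PQ
print(toneMapToSDR(pqToNits(pqCode)))   // SDR code value after tone mapping
```

Real converters do this per pixel in a wide-gamut colour space and with much smarter curves, but that's the gist of why a player without this step looks so washed out on an SDR display.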
I'm no expert and this post is the result of a few minutes of reading so please feel free to correct me if I'm wrong!
Ditto. I'm no expert either. But these are the main points I think are true:
1) The existing iMacs are already pseudo 10-bit (8-bit plus FRC; a toy illustration of FRC is sketched after this list), but they lack the peak brightness to be considered truly HDR10 compliant.
2) However, I don't think the newest LG monitors you listed meet those specs either.
3) All 2017 Macs except for the MacBook Air already support HDCP 2.2.
4) Hardware 4K HDR acceleration works well performance-wise on all 2017 Macs (except the MacBook Air), but we are still waiting to get the colours displayed correctly.
5) 4K HDR with proper colours and 4K DRM support are now just waiting on proper software support in macOS, since they already work on the iMac under Windows.
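And since "8-bit plus FRC" keeps coming up, here's a toy illustration of the trick (nothing to do with Apple's actual panel driver): the panel approximates a 10-bit value by flickering between the two nearest 8-bit values, so the average over a few frames lands in between.

```swift
// Approximate one 10-bit value (0-1023) on an 8-bit panel by alternating
// between the two nearest 8-bit values over a 4-frame cycle.
func frcFrames(value10bit: Int) -> [Int] {
    let low = value10bit / 4           // nearest 8-bit value below
    let remainder = value10bit % 4     // how far between the two 8-bit steps
    // Show the higher 8-bit value on `remainder` of the 4 frames.
    return (0..<4).map { $0 < remainder ? min(low + 1, 255) : low }
}

print(frcFrames(value10bit: 514))      // [129, 129, 128, 128] → averages 128.5 (= 514/4)
```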