
walexago

macrumors newbie
Original poster
Jan 14, 2018
Hi,

Sometimes, when the TV is busy (wife and/or children), I use my iMac (2011) to play HD movies. Now that there is 4K content with HDR/Dolby Vision, is an iMac suitable for playing it?

And if so, which model?

Thanks in advance
 
AFAIK there isn't video or game HDR support in macOS at the moment. It could possibly be introduced with the next macOS version. I know Windows 10 still has issues with its HDR implementation, so it's early days. LG is releasing HDR600-certified monitors with its new Nano IPS technology in May, and hopefully Apple will use a Nano IPS panel for the next iMac model.
 
Probably 2-3 years away yet on the Mac...
My guess is 2018 with the release of 10.14, in Kaby Lake Macs or later, and the iMac Pro.

No, the iMac won't have true HDR, but it does have a wide-gamut screen. All it really needs now is OS support, since the H.265 hardware decode support and the hardware 4K DRM support are already built into the machine. That OS support doesn't yet exist, but could very well appear in 2018 with 10.14. It already exists in Windows 10 PCs.
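If you want to check whether your own machine has that hardware H.265 decoder, here's a minimal Swift sketch using the public VideoToolbox call (available since macOS 10.13):

[CODE=swift]
import VideoToolbox

// Probe for hardware HEVC (H.265) decode support on this Mac.
// VTIsHardwareDecodeSupported was added in macOS 10.13 (High Sierra);
// it should return false on machines without the fixed-function decoder.
if #available(macOS 10.13, *) {
    let hasHEVCDecode = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
    print("Hardware HEVC decode:", hasHEVCDecode)
} else {
    print("VTIsHardwareDecodeSupported requires macOS 10.13 or later")
}
[/CODE]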
 
I ran through this and ended up buying a cheap Samsung 4K HDR TV for my home office. Much cheaper and a better picture. Then I purchased an Apple TV 4K to go with it. Works great for me. DRM on 4K content is a major block to streaming it on any computer.
 
As noted, the current iMac displays do not support HDR, though they do support Wide Color Gamut.

LG, Samsung and Asus are all starting to release 4K and 5K displays that support HDR10, but none support Dolby Vision.

iTunes does support 4K streaming on macOS, so it stands to reason that once the iMac adopts HDR displays (and if Apple releases external displays with HDR), this functionality will be enabled in macOS (probably just HDR10 rather than Dolby Vision, though the latter could be a differentiator for Apple).
 
The 5K panels are 10-bit, so in principle it should only be a matter of software, right?
 
The 5K panels are 10-bit, so in principle it should only be a matter of software, right?
They are not 10-bit.

The OS can handle 10-bit, but the 5K panels are WCG 8-bit screens, so 10-bit content is dithered down to that. It doesn't matter much though, since Apple can still support HDR iTunes and HDR Netflix, just dithered for the iMac 5K. I am hoping to see such DRM'd 4K HDR on June 4 with macOS 10.14.

However, even if Apple does implement this, it would likely require a 2017 Mac or later (not including the 2017 MacBook Air).
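If you want to see what macOS reports for your own display, here's a rough Swift sketch using AppKit's NSScreen. Keep in mind this shows the framebuffer depth the OS is driving, not necessarily the panel's native bit depth, so an 8-bit + FRC panel can still show up as deeper here:

[CODE=swift]
import AppKit

// Rough probe of what macOS reports for the main display.
// Caveat: this is the framebuffer depth the OS is using, which is not
// necessarily the physical panel's native bit depth (an 8-bit + FRC
// panel can still be driven with a deeper framebuffer and dithered).
if let screen = NSScreen.main {
    let depth = screen.depth
    print("Bits per sample:", NSBitsPerSampleFromDepth(depth))
    print("Bits per pixel: ", NSBitsPerPixelFromDepth(depth))
    print("Color space:    ", screen.colorSpace?.localizedName ?? "unknown")
}
[/CODE]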
 

So what is this "enable HDR" option that appeared on some Macs in this thread? Is it really "enable dithered HDR"?

https://forums.macrumors.com/threads/imac-5k-enable-hdr-option.1927091/
 
And on the 2017 iMacs with their 500 nits, is it still dithered?
500 nits is not sufficient to meet the full HDR specification for LCDs. You need about 1000 nits.

OLEDs are considered HDR with just 500 nits, but that's because their black levels are much, much lower than an LCD's.
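To put rough numbers on it (these figures are illustrative, not measurements of any particular set), contrast is peak brightness divided by black level:

[CODE=swift]
// Back-of-the-envelope contrast ratios with illustrative numbers:
// a bright full-array LCD versus a dimmer OLED whose blacks are nearly off.
let lcdPeakNits = 1000.0, lcdBlackNits = 0.05
let oledPeakNits = 500.0, oledBlackNits = 0.0005

print("LCD contrast  ≈ \(Int((lcdPeakNits / lcdBlackNits).rounded())):1")    // ≈ 20,000:1
print("OLED contrast ≈ \(Int((oledPeakNits / oledBlackNits).rounded())):1")  // ≈ 1,000,000:1
[/CODE]

So even at half the peak brightness, the OLED's effective contrast is far higher, which is why it still qualifies as HDR.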
BTW, while Apple states "one billion colours", which might lead you to believe it is a 10-bit panel, the specs for the LG UltraFine 5K USB-C display give it away. They state it is a "10bit (8bit + A-FRC)" panel.

What this means is that it's essentially fake 10-bit. 8-bit with A-FRC basically fakes colours by rapidly alternating between two different colours for the appropriate amount of time, effectively combining them to simulate an in-between colour. This is better than plain 8-bit, but obviously not real 10-bit.
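As a toy sketch of the idea (not the panel's actual algorithm), here is how temporal dithering can approximate one 10-bit level on an 8-bit panel:

[CODE=swift]
// Toy FRC-style temporal dithering (illustrative, not the real algorithm):
// approximate a 10-bit level on an 8-bit panel by alternating between the
// two nearest 8-bit levels so their time-average lands in between.
let tenBitLevel = 517                     // target value, 0...1023
let low = tenBitLevel / 4                 // 8-bit level just below (drop the two low bits)
let high = min(low + 1, 255)              // next 8-bit level up
let fraction = tenBitLevel % 4            // position between them, in quarters

// Over 4 frames, show `high` on `fraction` of them and `low` on the rest;
// the eye averages the flicker into an in-between shade.
let frames = (0..<4).map { $0 < fraction ? high : low }
let average = Double(frames.reduce(0, +)) / 4.0
print(frames, "averages to", average, "vs target", Double(tenBitLevel) / 4.0)
[/CODE]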

Not quite the same thing, but it reminds me of what some people call "FauxK" in projectors. True 4K projectors are insanely expensive, and 1080p projectors are reasonably priced. To keep costs down while jumping on the 4K bandwagon, some companies are now producing projectors that use 1080p panels but rapidly pixel-shift the panel slightly to paint a sort of 4K image.

The difference between a FauxK projector and a plain 1080p projector is that the FauxK projector understands the 4K video stream coming in and pixel-shifts its 1080p panel to simulate 4K, whereas the 1080p projector won't understand the 4K signal at all and can only ever produce a regular 1080p image. A good FauxK projector should look better than 1080p, but won't look as good as true 4K.

Similarly, an 8-bit with A-FRC panel like the LG UltraFine 5K (and iMac 5K) will understand a 10-bit signal and will simulate colours that look better than a traditional 8-bit panel, but it won't be the same as a true 10-bit panel.

So I guess some may suggest the answer to the question, "Is the iMac 5K's screen a 10-bit or an 8-bit panel?" is "Yes". ;)
 
I recently found a Dell monitor, the UP2718Q: it seems to be a 4K HDR10 (true 10-bit) display with DP 1.4.

What do you think?
 