And on the 2017 iMacs with their 500 nits, is it still dithered?
500 nits isn't sufficient to meet the full HDR specification for LCDs; you need roughly 1000 nits of peak brightness.
OLEDs qualify as HDR at around 500 nits, but that's because their black levels are much, much lower than any LCD's, so the overall contrast range is still far greater.
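To put rough numbers on that: the UHD Alliance's "Ultra HD Premium" thresholds certify an LCD at ≥1000 nits peak with ≤0.05 nits black, and an OLED at ≥540 nits peak with ≤0.0005 nits black (I'm going from memory here, so treat the exact figures with caution). A quick back-of-the-envelope calculation shows why the dimmer OLED still counts as HDR:

[CODE]
# Contrast range = peak brightness / black level, both in nits.
# Thresholds below are the UHD Alliance "Ultra HD Premium" numbers
# (from memory, not from this thread).
lcd_contrast  = 1000 / 0.05     # LCD:  >= 1000 nits peak, <= 0.05 nits black
oled_contrast = 540  / 0.0005   # OLED: >= 540 nits peak,  <= 0.0005 nits black
print(f"LCD:  {lcd_contrast:,.0f}:1")    # 20,000:1
print(f"OLED: {oled_contrast:,.0f}:1")   # 1,080,000:1
[/CODE]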
BTW, while Apple states "one billion colours", which might lead you to believe it is a 10-bit panel, the specs for the LG UltraFine 5K USB-C panel give it away: they state it is a "10bit(8bit + A-FRC)" panel.
In other words, it's a sort of fake 10-bit. An 8-bit panel with A-FRC fakes the extra colours by rapidly alternating between two adjacent 8-bit colours for the appropriate proportion of time, so that your eye blends them into an in-between shade. This is better than plain 8-bit, but obviously not real 10-bit.
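Here's a minimal sketch of the idea, 2-bit temporal dithering in Python (the names and the 4-frame cycle are illustrative; real FRC lives in the panel's timing controller and is more sophisticated):

[CODE]
# Minimal sketch of 2-bit temporal dithering, the idea behind "8bit + FRC".
def frc_frames(value_10bit: int, cycle: int = 4) -> list[int]:
    """Map one 10-bit channel value to a cycle of 8-bit frame values
    whose time-average approximates the 10-bit target."""
    assert 0 <= value_10bit <= 1023
    base, remainder = divmod(value_10bit, 4)   # 4 = 2**(10 - 8)
    # Show the brighter 8-bit neighbour for `remainder` frames out of the
    # cycle and the darker one for the rest; the eye averages them together.
    return [min(base + 1, 255)] * remainder + [min(base, 255)] * (cycle - remainder)

target = 513                    # a 10-bit level with no exact 8-bit equivalent
frames = frc_frames(target)     # -> [129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)   # ~513: averages back to the target
[/CODE]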
Not quite the same thing, but it reminds me of what some people call "FauxK" projectors. True 4K projectors are insanely expensive, while 1080p projectors are reasonably priced. To keep costs down while jumping on the 4K bandwagon, some companies now produce projectors that use 1080p panels but rapidly shift the panel by a fraction of a pixel to paint a sort-of 4K image.
The difference between a FauxK projector and a plain 1080p projector is that the FauxK projector understands the incoming 4K video stream and pixel-shifts its 1080p panel to approximate it, whereas the 1080p projector won't even accept the 4K signal and can only ever produce a regular 1080p image. A good FauxK projector should look better than 1080p, but won't look as good as true 4K.
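Conceptually the shift works something like this toy example (the diagonal sampling scheme and the half-pixel offset are a simplification of what real pixel-shift projectors do):

[CODE]
# Toy sketch of pixel-shift: split one 4K frame into two half-resolution
# sub-frames shown in quick succession, the second optically shifted by
# half a pixel diagonally. A 4x4 array stands in for the 4K frame.
import numpy as np

def split_for_pixel_shift(frame_4k: np.ndarray):
    sub_a = frame_4k[0::2, 0::2]   # panel at its home position
    sub_b = frame_4k[1::2, 1::2]   # panel shifted (+0.5, +0.5) pixels
    return sub_a, sub_b

frame = np.arange(16, dtype=float).reshape(4, 4)
a, b = split_for_pixel_shift(frame)
print(a)   # sub-frame for the unshifted position
print(b)   # sub-frame for the shifted position
[/CODE]

Each sub-frame throws away detail, but because the two are offset, together they carry more spatial information than a single 1080p frame, which is why it can look better than 1080p without ever reaching true 4K.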
Similarly, an 8-bit + A-FRC panel like the LG UltraFine 5K's (and the iMac 5K's) will accept a 10-bit signal and simulate colours that look better than a traditional 8-bit panel, but it won't be the same as a true 10-bit panel.
So I guess some may suggest the answer to the question, "Is the iMac 5K's screen a 10-bit or an 8-bit panel?" is "Yes".
