
Marswarrior462

macrumors 6502
Original poster
Sep 4, 2020
256
459
Calgary, AB, Canada
I have an iPhone 13 Pro Max. I’m impressed with how good the display is, with ProMotion, brightness, colour accuracy, etc. But I haven’t been able to find out what the colour depth is. Even those super detailed display reviews don’t mention anything about colour depth. So is the iPhone 13 Pro and Pro Max 8-bit or 10-bit? I’m assuming 8-bit because Apple hasn’t said anything about colour depth, but I hope it’s 10-bit. Can you help me out please?
 
  • Like
Reactions: Bishope1999
I honestly would like to know this as well. I have the 13 Pro Max and I buy a lot of 4K Blu-ray movies, and redeem a few codes when I don't give them away.

I tested Tenet, Jason Bourne and Jaws, just to name a few. On the iPhone 13 Pro Max, it doesn't look to me as if the color range is as good as it is on the 4K Blu-rays. I have a Sony X950H and I have it properly calibrated to be as accurate as possible. It gives me the impression that the 13 Pro Max is an 8-bit panel, and that could explain the color difference I am seeing.

It's also possible that a 4K Blu-ray disc is simply vastly superior to the streaming services, even when they support Dolby Vision HDR. But color depth isn't something that stands out as much when streaming those movies on my Sony TV.

It leads me to believe the iPhone 13 Pro Max has an 8-bit panel and not a 10-bit, but I could be wrong.
 
  • Like
Reactions: Schauspieler
I don’t think it is merely 8-bit color. Test results with the display showed 102% coverage of the DCI P3 color gamut, which indicates it is at least a little better than 8-bit (16.7 million colors). It might not be true 10-bit (1 billion colors) though. Apple stops suspiciously short of calling it 10-bit. When Apple talks about HDR video, they say the cameras can even record Dolby Vision HDR and that the iPhone 13 Pro Max can play it back, but they never specifically say the display actually has 10-bit color depth. You can of course view Dolby Vision HDR on an 8-bit display, but it won’t look as good as it does on a 10-bit display. There might be a good reason why Apple carefully avoids using “10-bit” in their descriptions of the display.

They call the display a Super Retina XDR display, which appears to be vague Apple-speak for an OLED display with “extended” dynamic range. What exactly does “XDR” really mean? Who knows. I do know that DCI P3 covers about 26% more color space than sRGB, and sRGB is 8-bit whereas DCI P3 can be up to 10-bit on a 10-bit display.

A lot of reviewers on the internet seem to think the iPhone 13 Pro Max has a 10-bit screen, but none I’ve read give any proof. Apple’s specs on their website simply don’t mention color depth. So again, why is Apple so careful in how they describe the display?

Having said all that, I do believe the iPhone 13 Pro Max display is capable of more colors than an 8-bit display. It simply looks much better than 8-bit displays, and it has a much brighter screen as well, which is the other part of HDR. It looks better than my Sony Bravia XBR-49X800H TV, which is definitely a 10-bit display, though that may be due more to the greater brightness and contrast capabilities than to color depth. So who knows, maybe it is 10-bit. At the very least it likely has an extended color depth beyond sRGB and 8-bit displays, but I have no clue whether it has as much color depth as a true 10-bit display.
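For reference, the "16.7 million" and "1 billion" figures follow directly from the bits per channel. A quick sketch of the arithmetic (plain Python, nothing iPhone-specific; the function name is just mine for illustration):

Code:
# Total colors available at a given bit depth per RGB channel.
def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel   # shades per channel (256 for 8-bit, 1024 for 10-bit)
    return levels ** 3               # every combination of R, G and B

print(f"{total_colors(8):,}")    # 16,777,216     -> the "16.7 million colors" figure
print(f"{total_colors(10):,}")   # 1,073,741,824  -> the "1 billion colors" figure
print(f"{total_colors(12):,}")   # 68,719,476,736 -> what 12-bit Dolby Vision allows for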
 
  • Sad
Reactions: AmazingTechGeek
I haven’t been able to find out what the colour depth is.

For a still or video? I assume 12-bit refers to bit depth.

[Attached screenshot: IMG_1263.jpeg]
 
I found this article on Apple’s website:
Traditional computer displays and HDTVs support a limited color space that’s based on a decades-old industry standard called Rec. 709. Rec. 709 devices (and the video content created for display on them) have standard-gamut color, the constrained color palette you see whenever you view a broadcast HDTV show, DVD, or Blu-ray disc.

A more recent generation of displays—including 4K televisions and computer displays, Apple TV 4K, and newer Mac, iOS, and iPadOS devices—can render a much wider palette of colors. These wide-gamut color devices display more vivid and lifelike hues (in addition to all the hues that standard-gamut devices can display). Accordingly, the video industry has adopted a wide-gamut color standard called Rec. 2020. Although most currently available wide-gamut devices support only a subset of the colors contained in the full Rec. 2020 specification, future imaging devices should be able to render more and more of those hues.
Apple lists the iPhone 13 Pro Max as a wide color gamut device. The definitive answer would come from a display test that lists the iPhone 13 Pro Max’s Rec. 2020 coverage (the uv measurement). Any measurement of 67% or higher coverage of Rec. 2020 is considered wide color gamut, and thus 10-bit color.

I’m still not sure why Apple is so cryptic about it though…
 
Post #4 seems to state that ProRes is 12-bit. Am I missing something?
If you are talking about ProRes video, it has nothing to do with the display or its color depth. It’s like a TV that says it has HDR but maxes out at 500 nits. That is not sufficient for HDR regardless of whether it’s an 8-bit or 10-bit panel, so it’s really fake HDR.

So that marketing is questionable when Apple doesn’t mention whether they are using an 8-bit or 10-bit panel. The iPhone 13 Pro Max doesn’t match the color accuracy of my Sony X950H in films, but that may also come down to streaming on the phone vs. UHD Blu-ray, which is how I watch my movies.

Streaming is inferior.

Now, Dolby Vision HDR is capable of 12-bit, but I believe Apple uses HLG HDR and sends that as Dolby Vision, at least for their video recording.
 
The screen shot I clipped is a setting for Photos, not video.
A ProRaw file might be 12-bit, but that doesn't mean the iPhone 13 Pro Max display is 12-bit by any means. You can view 16-bit images on an iPhone 13 Pro Max, but it certainly doesn't have a 16-bit display. If the display is 8-bit or 10-bit, then a 16-bit file (image or video) will only be displayed using the color depth of the display. That is why you sometimes see color banding on lower color depth displays when viewing files that have a higher color depth. Lower color depth displays still have to be able to show higher color depth files, else customers would be extremely angry, but they do so with far fewer colors, which tends to make the image or video look less vibrant.

There are tricks that help higher color depth files display better on lower color depth displays. For example, a JPEG can only be 8-bit at best. However, if you create the JPEG from a 16-bit RAW, TIFF, or PNG file and embed a DCI P3 (Display P3) color profile in it, it will look as though it has a higher color depth when viewed on Apple displays that use DCI P3. You only get 8 bits of color (16.7 million colors), but clever software can get those a lot closer to looking like 16-bit. What you are really getting is still just 8-bit, and placed beside the 16-bit version it will be evident that the 16-bit file is better. So it is a pseudo 16-bit-looking image hiding in a smaller 8-bit file.
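To illustrate the banding point above, here's a minimal sketch (plain Python, numbers purely illustrative) of what happens when a smooth 16-bit gradient is quantized down to a display's bit depth. Real rendering pipelines also apply dithering, so in practice it looks better than this suggests:

Code:
# A smooth 16-bit ramp collapses to far fewer distinct steps at the display's bit depth,
# which is where visible banding comes from.
def quantize(value_16bit, display_bits):
    step = 65536 // (2 ** display_bits)   # 16-bit input has 65,536 levels per channel
    return value_16bit // step            # nearest lower level the panel can show

gradient = range(65536)                   # every level of a 16-bit ramp
print(len({quantize(v, 8) for v in gradient}))    # 256 steps  -> banding more likely
print(len({quantize(v, 10) for v in gradient}))   # 1024 steps -> much smoother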
 
  • Like
Reactions: Bishope1999
If you are talking about ProRes video, it has nothing to do with the display or its color depth. It’s like a TV that says it has HDR but maxes out at 500 nits. That is not sufficient for HDR regardless of whether it’s an 8-bit or 10-bit panel, so it’s really fake HDR.
HDR is more than a brighter screen. In order to be HDR, the display has to be capable of increased dynamic range over that of SDR, but it also needs to be capable of 10-bit color depth (1 billion colors). As for the 500 nits comment, that is purely subjective. Most experts say brightness needs to be 400 nits or greater, but again that is subjective. There are displays with 500 nits that look better when viewing HDR content than displays with 1000 nits of brightness. Why? Because they are better designed and don’t try to rely on brightness alone. I wouldn’t call a display with 500 nits of brightness and 10-bit color “fake HDR”. It is true HDR, but likely not as good as a well designed display with 1000 nits of brightness.
 
  • Like
Reactions: Bishope1999
HDR is more than a brighter screen. In order to be HDR, the display has to be capable of increased dynamic range over that of SDR, but it also needs to be capable of 10-bit color depth (1 billion colors). As for the 500 nits comment, that is purely subjective. Most experts say brightness needs to be 400 nits or greater, but again that is subjective. There are displays with 500 nits that look better when viewing HDR content than displays with 1000 nits of brightness. Why? Because they are better designed and don’t try to rely on brightness alone. I wouldn’t call a display with 500 nits of brightness and 10-bit color “fake HDR”. It is true HDR, but likely not as good as a well designed display with 1000 nits of brightness.
Yes, it can display HDR, but it won't be able to hit the highlights needed to reproduce what was intended. The dynamic range will be displayed, but the TV itself won't have enough brightness to show it properly.

For example, on a 500-nit display, the film may show an outdoor scene with the sun on the right side. Generally, the brightness will look much the same across the screen because the panel doesn't have the headroom for the specular highlights. With a higher-nit display, the reflection off a car can be 200 nits while the sun, in the same image, is hitting 1,000 nits.

You actually need that range, along with the wide color gamut and the 10-bit panel, to properly display an HDR video on a TV, which is a different thing from an HDR photo.

Admittedly, the truly fake HDR TVs are the ones that accept an HDR input but have an 8-bit panel and no wide color gamut.
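Just to put rough numbers on the highlight example above, here's a minimal sketch (plain Python, illustrative only; it uses a simple hard clip, whereas real TVs apply tone-mapping roll-off, so this overstates the effect):

Code:
# Illustrative only: how luminance mastered in nits might land on panels with different
# peak brightness, using a hard clip at the panel's peak.
def displayed_nits(mastered_nits, panel_peak):
    return min(mastered_nits, panel_peak)

scene = {"car reflection": 200, "sun": 1000}          # values from the example above
for name, nits in scene.items():
    print(name, displayed_nits(nits, 500), displayed_nits(nits, 1000))
# On the 500-nit panel the sun clips to 500 nits, so the intended 5x gap between the
# sun and the reflection shrinks to 2.5x; the 1000-nit panel keeps the full separation.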
 
AFAIK, at least within the computer lineup, the only true 10-bit display Apple sells is the Pro Display XDR. And even there I had to find the specs buried in a technical white paper. Everything else is either 8-bit or 8-bit + FRC. The latter can produce more colors than 8-bit by using temporal dithering (which also causes eye strain in some people).


Honestly, I have no idea if the iPhone OLED displays use the same gimmick to get 10-bit color out of an 8-bit panel like the LCDs do, or if they are actually 10-bit. There was mention of a 10-bit image processing pipeline when the iPhone 12 came out.
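For anyone unfamiliar with FRC, here's a minimal sketch of the idea as I understand it (plain Python, illustrative only; the function name is made up): a 10-bit level that has no exact 8-bit match is approximated by alternating the two neighbouring 8-bit levels across frames so the time-average lands in between.

Code:
# Temporal dithering (FRC) sketch: fake a 10-bit level on an 8-bit panel by flickering
# between the two nearest 8-bit values over a short frame cycle.
def frc_frames(level_10bit, n_frames=4):
    base = level_10bit // 4          # nearest lower 8-bit level (1024 / 256 = 4 steps per level)
    remainder = level_10bit % 4      # how far into the gap we are (0-3)
    # Show the higher 8-bit level on `remainder` out of every 4 frames.
    return [base + 1 if i < remainder else base for i in range(n_frames)]

frames = frc_frames(513)                    # a 10-bit level with no exact 8-bit equivalent
print(frames, sum(frames) / len(frames))    # [129, 128, 128, 128] averages 128.25 (= 513 / 4)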
 
AFAIK, at least within the computer lineup, the only true 10-bit display Apple sells is the Pro Display XDR. And even there I had to find the specs buried in a technical white paper. Everything else is either 8-bit or 8-bit + FRC. The latter can produce more colors than 8-bit by using temporal dithering (which also causes eye strain in some people).


Honestly, I have no idea if the iPhone OLED displays use the same gimmick to get 10-bit color out of an 8-bit panel like the LCDs do, or if they are actually 10-bit. There was mention of a 10-bit image processing pipeline when the iPhone 12 came out.
Do you know which devices are 8-bit + FRC?
 
Do you know which devices are 8-bit + FRC?
With the phones, no. I suspect all devices newer than the iPhone 6s have some level of dithering, as that's when they started supporting "billions" of colors. With the computers, I think the only display that is true 10-bit is the Pro Display XDR.
 
A ProRaw file might be 12-bit, but that doesn't mean the iPhone 13 Pro Max display is 12-bit by any means. You can view 16-bit images on an iPhone 13 Pro Max, but it certainly doesn't have a 16-bit display. If the display is 8-bit or 10-bit, then a 16-bit file (image or video) will only be displayed using the color depth of the display. That is why you sometimes see color banding on lower color depth displays when viewing files that have a higher color depth. Lower color depth displays still have to be able to show higher color depth files, else customers would be extremely angry, but they do so with far fewer colors, which tends to make the image or video look less vibrant.

There are tricks that help higher color depth files display better on lower color depth displays. For example, a JPEG can only be 8-bit at best. However, if you create the JPEG from a 16-bit RAW, TIFF, or PNG file and embed a DCI P3 (Display P3) color profile in it, it will look as though it has a higher color depth when viewed on Apple displays that use DCI P3. You only get 8 bits of color (16.7 million colors), but clever software can get those a lot closer to looking like 16-bit. What you are really getting is still just 8-bit, and placed beside the 16-bit version it will be evident that the 16-bit file is better. So it is a pseudo 16-bit-looking image hiding in a smaller 8-bit file.
Re: “16-bit due to clever software" who are you trying to dupe here? 16-bit is medium format, phone santa😅😂🤣 The best apple display can only produce 8 + 2 bits of color. Go educate yourself
 