
nol2001

macrumors regular
Original poster
Aug 15, 2013
UK
Is there a chance the new Mac Pro will output 10-bit colour in OS X 10.9? I'm not sure if 10-bit is part of the 4K specifications or not.
 
Not happening.

I filed many, many bug reports on this and multiple other colour management issues for 10.9. None of the colour management bugs were even acknowledged. After the 4th bug report requesting 10-bit colour, I got a response that very bluntly stated 10.9 would not support it and there were no plans to add it.

Maybe something will have changed for the new Mac Pro (I wouldn't put it past Apple to enable 10-bit colour only for that machine), but for existing machines, I think we're SOL.

-SC
 
Apple supports 24-bit color output actually, 16.7 million colors on an external display. What is 10-bit, like a small upgrade from 8-bit which is only 256 colors...
 
Apple supports 24-bit color output actually, 16.7 million colors on an external display. What is 10-bit, like a small upgrade from 8-bit which is only 256 colors...

No... it's 30-bit color (10-bits per channel) and allows for 1024 shades per primary color for a total of 1.07 billion colors.
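
If anyone wants to sanity-check those numbers, here is a quick throwaway Python snippet (purely illustrative, nothing Apple-specific):

```python
# Shades per channel and total displayable colors at 8-bit vs 10-bit.
for bits_per_channel in (8, 10):
    shades = 2 ** bits_per_channel          # 256 at 8-bit, 1024 at 10-bit
    total = shades ** 3                     # three channels: R, G and B
    print(f"{bits_per_channel}-bit/channel: {shades} shades, {total:,} colors")

# 8-bit/channel: 256 shades, 16,777,216 colors   (~16.7 million)
# 10-bit/channel: 1024 shades, 1,073,741,824 colors  (~1.07 billion)
```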
 
No... it's 30-bit color (10-bits per channel) and allows for 1024 shades per primary color for a total of 1.07 billion colors.

So as someone who doesn't keep a close eye on this segment of the market... What is the specific gain here? Does hitting 100% visible gamut coverage require it? Are there certain hues that 8-bit channels can't represent correctly due to the lack of resolution between "0.0f" and "1.0f"? Or are we just slicing up the range thinner and thinner because it is now possible?

I'm honestly curious, since the last research I saw for developers on the subject was over a decade ago, and it seemed happy with what 24-bit RGB could do.
 
So as someone who doesn't keep a close eye on this segment of the market... What is the specific gain here? Does hitting 100% visible gamut coverage require it? Are there certain hues that 8-bit channels can't represent correctly due to the lack of resolution between "0.0f" and "1.0f"? Or are we just slicing up the range thinner and thinner because it is now possible?

I'm honestly curious, since the last research I saw for developers on the subject was over a decade ago, and it seemed happy with what 24-bit RGB could do.

No, the gamut is called the color space. This is even more important. Below is a CIE chart (what the average eye can see) with Rec.709 (HDTV) and Rec.2020 (UHDTV, aka 4K). As you can see, the color space will be much fuller but still nowhere near "full".

Bit depth is about the number of increments between intensities of a color. The more bits, the smoother the changes can be.

[Image: CIE chromaticity diagram with the Rec.709 and Rec.2020 gamuts overlaid]
 
No, the gamut is called the color space. This is even more important. Below is a CIE chart (what the average eye can see) with Rec.709 (HDTV) and Rec.2020 (UHDTV, aka 4K). As you can see, the color space will be much fuller.

Bit depth is about the number of increments between intensities. The more bits, the smoother the changes can be.

Yes, that much I do remember, but you don't actually answer my question. What tangible benefit are we getting from 10-bit beyond the obvious one of having better resolution in each channel? If research was showing that 24-bit was already producing more resolution on a color than the eye can distinguish, I'm assuming the research is wrong or there is a side-benefit of going 10-bit that helps these other concerns.
 
No, the gamut is called the color space. This is even more important. Below is a CIE chart (what the average eye can see) with Rec.709 (HDTV) and Rec.2020 (UHDTV, aka 4K). As you can see, the color space will be much fuller but still nowhere near "full".

Wait a sec, I'm seeing that chart on my computer screen, and you're saying everything outside the triangles can't be displayed... so it's like how much more colorful could this be?

And the answer is none. None more colorful.
 
Yes, that much I do remember, but you don't actually answer my question. What tangible benefit are we getting from 10-bit beyond the obvious one of having better resolution in each channel? If research was showing that 24-bit was already producing more resolution on a color than the eye can distinguish, I'm assuming the research is wrong or there is a side-benefit of going 10-bit that helps these other concerns.

That's what I remember reading too: that the human eye can't distinguish more than 16.7 million colors, so computers displaying more was pointless. Perhaps this is for the benefit of other animal species that can see further into the UV and IR spectrum. ;)
 
Yes, that much I do remember, but you don't actually answer my question. What tangible benefit are we getting from 10-bit beyond the obvious one of having better resolution in each channel? If research was showing that 24-bit was already producing more resolution on a color than the eye can distinguish, I'm assuming the research is wrong or there is a side-benefit of going 10-bit that helps these other concerns.

All colors on the screen are made by mixing red, blue and green in different intensities. For simplicity's sake, let's say you have 10 different intensities of each (green 1-10, ...). You would be very limited in the number of combinations you could make. Just one change would create a large step that would be noticeable. As a green object (our eyes are most sensitive to green) went from green 8 to green 7, you would see a division line as the object's color changes intensity.

More steps means smoother color. 8-bit only allows 256 different intensities for each color; 10-bit gives you 1024.
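
To put that in something concrete, here is a rough, purely illustrative Python calculation of how tall each visible "band" would be on a full-screen black-to-white gradient (the 2160-pixel panel height is just my assumption for a 4K display):

```python
# Approximate band height on a full-screen vertical gradient.
panel_height_px = 2160  # assumed 4K panel height

for bits in (8, 10):
    levels = 2 ** bits                      # 256 levels at 8-bit, 1024 at 10-bit
    band_px = panel_height_px / levels      # pixels covered by each single level
    print(f"{bits}-bit: {levels} levels, about {band_px:.1f} px per band")

# 8-bit: 256 levels, about 8.4 px per band   -> steps can be visible
# 10-bit: 1024 levels, about 2.1 px per band -> steps are much harder to see
```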

Wait a sec, I'm seeing that chart on my computer screen, and you're saying everything outside the triangles can't be displayed... so it's like how much more colorful could this be?

And the answer is none. None more colorful.
Hope you are not serious. :confused: The colors on the screen are not accurate and are for illustration purposes. Do you expect the colors outside of the triangle to just be black because they fall outside of the device's ability to reproduce them 100% accurately? :D Also, unless your monitor is ISF or THX professionally calibrated, the picture will be even less accurate.

You must have a certified printed card on hand in order to accurately see them as they truly are.
[Image: printed color reference card]
 
You can get 10-bit support with an external box (Blackmagic, AJA, etc.) even on a MacBook Air now, through Thunderbolt. I think Apple gave up on having OS X support 10-bit natively a long, long time ago, and maybe their reasoning is that it's a "pro" demand, and those pros can use those external devices? You do need a 10-bit compatible monitor, after all.
 
I'll go so far as to say that the only reason Apple did not give us 10-bit support is because they do not make the hardware/monitors to support it... yet?
 
There are a lot of people who have worked with more than 8 bits/channel for more than a decade. Taking pictures in RAW format will give you at least 10, and with a good camera as much as 14 bits/channel.

It has been possible to edit images with 16 bits/channel in Photoshop for many years, and today even 32 bits/channel (HDR), even though many filters do not support it. The main benefit is that it adds a lot of flexibility in editing. You cannot do much editing with an 8-bit/channel JPEG image before you start to severely degrade the image; a 16-bit/channel TIFF image can keep up much better.

For our visual perception, 8 bits/channel works perfectly well for an evenly lit subject. However, in situations with high contrast, not even a 14-bit/channel raw image is a match for the human eye. Our eye certainly cheats a little, adapting to the intensity level we are focusing on. But nevertheless we appear to see detail in both the extremely sunlit wall and the deep shadow, whereas the camera doesn't.

It is a complicated subject, with several factors involved. But, as pointed out, neither the color space nor the dynamic range of modern displays is yet close to the human eye.
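
To see the kind of editing degradation being described, here is a small toy NumPy sketch (my own illustration, not anything from Photoshop) that pushes a smooth gradient through a heavy darken-then-brighten edit at two bit depths and counts how many distinct tones survive:

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)  # a smooth floating-point gradient

def surviving_tones(bits):
    levels = 2 ** bits - 1
    img = np.round(ramp * levels)                    # quantize to the target bit depth
    img = np.round(img * 0.25)                       # heavy "darken" edit
    img = np.clip(np.round(img * 4.0), 0, levels)    # try to undo it later
    return len(np.unique(img))                       # distinct tones left in the gradient

print(surviving_tones(8))    # ~65 tones left -> visible posterization/banding
print(surviving_tones(16))   # ~4096 tones left (limited only by the sample count)
```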
 
10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.
 
10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.

Doesn't 4K video (shown on televisions) use 10-bit colour though? That is what I was wondering might make it mainstream enough to be on the new Mac Pro.
 
10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.

Please... That is nonsense. If a person can appreciate the difference with a higher bit count, then that person is free to acquire it! It's not an MRI scanner!

How would you compare FLAC audio to MP3??
 
10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.

Oh, OK.

Can you please explain why HP, NEC, and Eizo all manufacture and market 10-bit displays aimed at high end graphics designers? I mean, clearly they don't need this capability because you say they don't.

-SC
 
10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.

It's good of you to decide what the rest of us need or don't need. Saves me having to think for myself.

Please can you choose some lottery numbers for me whilst you're at it?
 
What is the specific gain here?

That's what I remember reading too: that the human eye can't distinguish more than 16.7 million colors, so computers displaying more was pointless. Perhaps this is for the benefit of other animal species that can see further into the UV and IR spectrum. ;)

I assure you humans can notice. If you have a Blu-ray player, it is very simple to demonstrate. Get Planet Earth. Watch the opening menu. See the visible banding in the blue shades? That's because Blu-ray is only 8 bits per color. 10 bits per color would fix the very visible and obvious banding issue.

Any time you have a rich gradient in a single color, you'll see banding unless post-processing such as dithering is applied, but then you are introducing inaccuracy and a loss of sharpness and detail. (So if you aren't seeing the banding, it's because you have a video processor that's applying dithering or some other smoothing algorithm.)
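
For anyone curious what that dithering trade-off looks like, here is a small illustrative NumPy sketch (my own toy example, not how any particular player actually implements it). It quantizes a very gentle gradient, roughly like that blue sky, with and without noise dithering:

```python
import numpy as np

rng = np.random.default_rng(0)

# A gentle gradient spanning only a handful of 8-bit code values across 1000 pixels.
x = np.linspace(100.0, 104.0, 1000)

hard = np.round(x)                                        # plain quantization
dithered = np.round(x + rng.uniform(-0.5, 0.5, x.size))   # add noise before rounding

print(len(np.unique(hard)))   # 5 distinct values -> 5 flat bands with sharp edges

# With dithering, individual pixels are noisy, but the local averages track the
# true ramp closely, which is why the banding disappears at the cost of noise.
local_error = np.abs(dithered.reshape(-1, 50).mean(axis=1) - x.reshape(-1, 50).mean(axis=1))
print(local_error.max())      # small (well under one code value)
```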
 
Yes, that much I do remember, but you don't actually answer my question. What tangible benefit are we getting from 10-bit beyond the obvious one of having better resolution in each channel? If research was showing that 24-bit was already producing more resolution on a color than the eye can distinguish, I'm assuming the research is wrong or there is a side-benefit of going 10-bit that helps these other concerns.

I could write pages on the topic. It's not about gamut coverage at all; that is completely wrong. It lessens the need for dithering, as banding can otherwise be more pronounced in displays that need to cover a wide gamut. It makes it easier to calibrate to a given target without undesirable side effects, due to fewer combinations being truly required to render a perceptually continuous tone. The last thing is that RGB values are not evenly spaced. Shadows have a lower allocation of total values, so you could see a significant benefit when it comes to shadow detail, assuming that reflections and coatings don't kill the lowest values. Do keep in mind that these things become increasingly significant if you have to deal with the correlation of multiple displays of various ages. Overall it's just one detail; it's not a panacea for all monitor problems.

The Mac Pro speculation was probably due to the fact that many FirePro drivers support full 10-bit paths on Windows under certain applications. I don't expect Apple to go that route, though. Thunderbolt previously didn't support it. Thunderbolt 2 will, but I doubt it would be supported at 4K if those things actually hit mainstream markets in the near future.


10-bit is more for the medical imaging market.

Nobody else has any need for it. And the medical imaging market does not use Apple products for that purpose.

I would like to know your reasoning for this. The medical market has specific requirements. There are exceptionally high-resolution displays at 17-24" that cater specifically to the medical market, but that doesn't make 10-bit entirely useless elsewhere. Even if the OS cannot completely accommodate it, there may be some benefit with displays that set their own internal LUTs, as the total possible range of values is in that case set on the display end, with the ICC profile merely describing its behavior for other "managed" applications. ICC in itself is kind of a bad system, but that's another topic.
 