
Dutch60

macrumors regular
Original poster
May 18, 2019
221
80
Sorry, I was too direct in my earlier posts and caused/showed confusion.

Based on the gradient PNG linked by @Superhai, and looking mostly at the grey, my eyeballed results are:

Does display 10-bit: Photoshop 2022, Lightroom Classic, Pixelmator Pro, Acorn, Aurora HDR 2018, Apple Photos, Safari, Preview.
Only displays 8-bit: Affinity Photo (that surprised me), digiKam, Brave, Firefox.
All the photo editors above can process in 16-bit or better (not sure about Photos).
I have deliberately included some browsers and Preview.
With the exception of Aurora HDR 2018, all apps were the current version, running on macOS 12.5.1 on a 2019 27" iMac.

So my recommendation for a suitable 10-bit display photo editing app is Pixelmator Pro which is functionally similar to photo editing with Photoshop. But maybe test Photos first as it is free.
No problem with your directness. I was confusing as well; I mean editing in 10-bpc / 10-bits per colour; not merely viewing.
 

bogdanw

macrumors 603
Mar 10, 2009
6,114
3,021
No problem with your directness. I was confusing as well; I mean editing in 10-bpc / 10-bits per colour; not merely viewing.
10-bit is not a standard for images.
Photoshop doesn’t allow you to create 10-bit images, it’s 8, 16 or 32.
[Attached screenshot: NewImagePhotoshop.png]
Here is another explanation and photo editor - Krita
https://docs.krita.org/en/general_concepts/colors/bit_depth.html
https://krita.org/
 

Dutch60

macrumors regular
Original poster
May 18, 2019
221
80

I think it does(?)
I don't use Photoshop (or anything from Adobe apart from the AdobeRGB colour space :) )
So you're probably more qualified to judge this.
 

Dutch60

macrumors regular
Original poster
May 18, 2019
221
80
See the image above and
Optimize performance Photoshop https://helpx.adobe.com/photoshop/kb/optimize-photoshop-cc-performance.html
“30 Bit Display: Enable to increase the color fidelity on a monitor which supports 30 bit.”
How to Optimize Photoshop Preferences for Better Performance
I don't understand your post. Could you clarify?
I'm not trying to optimize the performance of Photoshop... I don't use it.

From the PetaPixel article above:

"Voila! You should now be able to edit your photographs with billions of colors instead of the 16.8 million colors you get with 8-bit color. Enjoy"
 

bogdanw

macrumors 603
Mar 10, 2009
6,114
3,021
I’m sorry I can’t explain better. I’ll let someone else.
 

gilby101

macrumors 68030
Mar 17, 2010
2,946
1,630
Tasmania
No problem with your directness. I was confusing as well; I mean editing in 10-bpc / 10-bits per colour; not merely viewing
Got you. Editing first:
Every time you do an edit, you lose a little quality. This is most obvious with 8-bit images, where it is easy to get increased posterisation after a few edits at 8 bits per colour, but it applies to all editing. So if the intention is to display at a colour resolution of x bits, it is advisable to edit at more than x bits to avoid losing quality in the editing process. To get the best results on a 10-bit-per-colour display, you should edit at something better than 10 bits.

I suggest you edit in 16 bits per colour. That will ensure that you get the best results when viewing at 10 bits (or 8 bits) per colour.
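The posterisation effect can be sketched in a few lines of Python (an illustration only, not taken from any of the apps discussed): repeatedly apply a small darkening edit and its inverse, quantising to the working bit depth after every operation, and count how many distinct tones survive.

```python
def roundtrip_levels(bits, steps=20, gain=0.93):
    """Apply a darken edit and its inverse `steps` times, quantising to
    the working bit depth after every operation; return how many of the
    256 source tones remain distinct."""
    scale = (1 << bits) - 1
    values = [i / 255 for i in range(256)]  # a smooth 8-bit ramp
    for _ in range(steps):
        # edit: darken, then snap to the working bit-depth grid
        values = [round(v * gain * scale) / scale for v in values]
        # undo: brighten back, snapping to the grid again
        values = [min(round(v / gain * scale), scale) / scale for v in values]
    return len(set(values))

print(roundtrip_levels(8))   # fewer than 256 tones survive -> posterisation
print(roundtrip_levels(16))  # all 256 tones survive
```

At 8 bits the rounding after each step merges neighbouring tones and they never come back apart; at 16 bits the grid is fine enough that the same edits are effectively reversible.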

And viewing:
You do need to make sure that the editing software and, if different, the display software are capable of presenting 10 bits. The viewing part is important too.

Caveat:
What I have said applies to most photography. But if your images are from a specialised field (e.g. astronomy or radiology), you need more specialised advice.
 

cupcakes2000

macrumors 601
Apr 13, 2010
4,035
5,425
Every time you do an edit, you lose a little quality.
This isn't necessarily true, as you can (and should) be shooting in a lossless format such as RAW and editing that. Certainly someone looking for this kind of advanced editing ability would already be shooting in such a format. There is no loss of quality with any amount of editing.
 

gilby101

macrumors 68030
Mar 17, 2010
2,946
1,630
Tasmania
This isn't necessarily true, as you can (and should) be shooting in a lossless format such as RAW and editing that. Certainly someone looking for this kind of advanced editing ability would already be shooting in such a format. There is no loss of quality with any amount of editing.
RAW format does not guarantee no loss in processing. RAW is only lossless in the sense that the camera has taken the sensor output and written it into the digital image without any changes (and the consequent loss of quality) from in-camera processing. What you do later will almost always be lossy.

Processing in a photo app will introduce loss of quality. Let's say you have a camera which takes 14-bit RAW images. If you edit that on your computer with 8-bit processing, you will obviously lose a lot of quality. So it would be normal to do any editing in 16-bit to minimise any loss due to editing.
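The 14-bit-to-8-bit squeeze is easy to see with a little arithmetic (an illustrative sketch, not any editor's actual pipeline): a 14-bit sensor records 16384 tonal levels, an 8-bit pipeline has 256, so every 8-bit level has to stand in for 64 source levels.

```python
def to_8bit(v14):
    """Map a 14-bit value (0..16383) onto the 8-bit range (0..255)."""
    return round(v14 / 16383 * 255)

# 64 consecutive 14-bit shadow values collapse to just two 8-bit levels
shadow = [to_8bit(v) for v in range(100, 164)]
print(len(set(shadow)))  # 2
```

Subtle shadow detail that was distinct in the RAW file becomes indistinguishable the moment it passes through an 8-bit working space.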
 

cupcakes2000

macrumors 601
Apr 13, 2010
4,035
5,425
RAW format does not guarantee no loss in processing. RAW is only lossless in the sense that the camera has taken the sensor output and written it into the digital image without any changes (and the consequent loss of quality) from in-camera processing. What you do later will almost always be lossy.

Processing in a photo app will introduce loss of quality. Let's say you have a camera which takes 14-bit RAW images. If you edit that on your computer with 8-bit processing, you will obviously lose a lot of quality. So it would be normal to do any editing in 16-bit to minimise any loss due to editing.
Most RAW editors are non-destructive; the edit sits in a sidecar file. There is no loss of quality because you haven't touched the original image.

If you start to convert an image in some destructive way, then yes, you can affect its quality (resize it, change the bit depth, change the file format, etc.). However, simply loading a RAW image into a RAW editor (Capture One, Lightroom, Camera Raw, etc.) and making edits doesn't remotely affect the quality of the original file. You can edit a single RAW photo thousands of times, and that single RAW photo will remain the exact same quality.
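The sidecar model can be sketched like this (hypothetical recipe fields, not any particular editor's format): the edits live in a separate recipe and are only applied when rendering an export, so the original data is never modified.

```python
# stand-in for the immutable RAW data; it is never written to
original = [0.10, 0.25, 0.50, 0.75]

# the "edit" lives in a separate sidecar recipe (field names are made up)
sidecar = {"exposure": 0.05, "contrast": 1.2}

def render(raw, recipe):
    """Apply the recipe to produce an export; the input is left unchanged."""
    out = [(v - 0.5) * recipe["contrast"] + 0.5 + recipe["exposure"] for v in raw]
    return [min(max(v, 0.0), 1.0) for v in out]

export = render(original, sidecar)
# re-render as many times as you like: the original never changes
assert original == [0.10, 0.25, 0.50, 0.75]
```

Changing your mind later just means changing the recipe; the RAW file stays bit-identical throughout.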
 

Slartibart

macrumors 68040
Aug 19, 2020
3,142
2,817
Every time you do an edit, you lose a little quality. This is most obvious with 8-bit images, where it is easy to get increased posterisation after a few edits at 8 bits per colour.
Do you have any references for this?

As far as I know, this is only true if you save the file in a format with lossy compression. If you open an 8-bit image (or any colour depth, whether pseudo-colour or >=24-bit true colour), edit a pixel, and save in a lossless format, only that pixel is altered. If you change that pixel a hundred consecutive times and save the image again and again, there is no loss in "quality", whatever that is supposed to mean.

Of course, changes occur when mapping the colours of an image with a certain colour depth per pixel into a different colour model or space. But as long as you use a lossless image format and keep the LUT and colour space constant, there is no change to the image at all, besides the initial change when saving e.g. an "original" 14-bit-per-channel RAW from your camera to an 8-bit-per-channel pseudo-colour uncompressed TIFF.

If under these conditions a visible or numerical change to the 8-bit image's tonal or colour data occurs... well, then it's time to look for different image processing software. 🤓
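A toy illustration of that point (simulated saves, with coarse quantisation standing in for JPEG's lossy step): generation after generation of lossless saves returns the data bit for bit, while a single lossy save already alters pixel values.

```python
def save_lossless(pixels):
    """A lossless save/load round trip returns the data bit for bit."""
    return list(pixels)

def save_lossy(pixels, q=16):
    """Crude stand-in for JPEG-style quantisation during a lossy save."""
    return [round(v / q) * q for v in pixels]

img = list(range(0, 256, 5))
a = img
for _ in range(100):
    a = save_lossless(a)  # 100 generations, still bit-identical
assert a == img

b = save_lossy(img)
assert b != img           # one lossy save already changes pixel values
```

Real JPEG loss comes from DCT quantisation rather than this simple rounding, but the generational behaviour is the same: lossless formats are idempotent, lossy ones are not.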
 

Superhai

macrumors 6502a
Apr 21, 2010
735
580
I think it does(?)
I don't use Photoshop (or anything from Adobe apart from the AdobeRGB colour space :) )
So you're probably more qualified to judge this.
Almost all modern photo editing software lets you choose only 8-bit, 16-bit, and sometimes 32-bit for editing. For display, there are very few true 10-bit panels, but I am sure your Eizo is one of the few. Many displays that claim to support 10-bit or more are actually only 8-bit, using spatial or temporal dithering (or both) so that you perceive more colours. Even "8-bit panels" are sometimes only 6-bit or less, although that is not common anymore.

Also, even if photo editing software says it edits in 16-bit, it will temporarily lower quality to speed up preview processing (which is why you may see banding or other artifacts in the preview but not in the final product), and sometimes it does destructive processing down to maybe 12-14 bits, either because it needs the headroom or because full precision is too expensive to process.

And then there is file format support, which is also something to be aware of. JPEG, for instance, technically supports 12-bit colour, but almost no software takes advantage of it.

On top of that, various colour spaces map differently, so certain gradients can look better yet still be 8-bit.

My suggestion is to use software that supports a 10-bit display and to edit in 16-bit to take advantage of it, as long as you have the horsepower and storage available and your entire workflow supports it.
 

Dutch60

macrumors regular
Original poster
May 18, 2019
221
80
Almost all modern photo editing software lets you choose only 8-bit, 16-bit, and sometimes 32-bit for editing. For display, there are very few true 10-bit panels, but I am sure your Eizo is one of the few. Many displays that claim to support 10-bit or more are actually only 8-bit, using spatial or temporal dithering (or both) so that you perceive more colours. Even "8-bit panels" are sometimes only 6-bit or less, although that is not common anymore.

Also, even if photo editing software says it edits in 16-bit, it will temporarily lower quality to speed up preview processing (which is why you may see banding or other artifacts in the preview but not in the final product), and sometimes it does destructive processing down to maybe 12-14 bits, either because it needs the headroom or because full precision is too expensive to process.

And then there is file format support, which is also something to be aware of. JPEG, for instance, technically supports 12-bit colour, but almost no software takes advantage of it.

On top of that, various colour spaces map differently, so certain gradients can look better yet still be 8-bit.

My suggestion is to use software that supports a 10-bit display and to edit in 16-bit to take advantage of it, as long as you have the horsepower and storage available and your entire workflow supports it.
Sorry for this late response (I have been away for a couple of weeks). Thank you for your advice. DxO PL6 now has a "wide" colour space (wider than AdobeRGB). I didn't buy it yet, but I certainly will. Not sure about the 10-bit support yet.
 

Lothar29

macrumors newbie
Jan 3, 2023
2
0
Sorry for this late response (I have been away for a couple of weeks). Thank you for your advice. DxO PL6 now has a "wide" colour space (wider than AdobeRGB). I didn't buy it yet, but I certainly will. Not sure about the 10-bit support yet.
I'm not using a Mac, but maybe this helps. I develop RAW with DxO into 16-bit TIFF. Then I use ACDSee Ultimate (2019) for further editing of the TIFFs. A picture of a sunset with a smooth gradient is perfect if I look at the TIFF on my 8+2-bit monitor. It is also perfect when converted to JPEG 2000 (16-bit). Converted to JPEG (8-bit), I can see banding.
My conclusion is that ACDSee is able to display the 16-bit pictures at a 10-bit level. This is valid for Windows 10, but give it a try..... (BTW, the Windows Photos app also displays a smooth gradient with the TIFF, and not with the JPEG.)
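A rough sketch of why a subtle sky gradient bands at 8-bit but not at higher depths (illustrative numbers only): a smooth ramp covering, say, 10% of the tonal range only gets a couple of dozen distinct 8-bit levels, which the eye sees as bands, while a 16-bit encoding has a step per pixel to spare.

```python
# a subtle sky gradient spanning only 10% of the tonal range
width = 2000
gradient = [0.45 + 0.10 * x / (width - 1) for x in range(width)]

steps_8 = len({round(v * 255) for v in gradient})     # distinct 8-bit levels
steps_16 = len({round(v * 65535) for v in gradient})  # distinct 16-bit levels
print(steps_8, steps_16)  # ~26 coarse bands vs thousands of fine steps
```

This also matches the JPEG 2000 observation: anything that preserves more than 8 bits per channel (or dithers well) keeps the steps below the visible threshold.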
 

Lothar29

macrumors newbie
Jan 3, 2023
2
0
I am so sorry... I had some more time today and also tried some other conversions of the original RAW in DxO and looked at the results in ACDSee, DxO and the Windows Photos app. There is only visible banding in the 8-bit JPEG. There is NO banding in a 16-bit TIFF, an 8-bit TIFF, or a 16-bit JPEG 2000. Because there is no difference between the 8-bit and 16-bit TIFFs, it seems that ACDSee does not use the 10 (8+2) bits of my display. It seems that the banding results from the compression of the JPEG (though I used "100% quality"). Quite interesting to see that JPEG 2000 does not show any banding with this picture, probably because of its superior compression algorithm. JPEG is a quite outdated format, and it is hard to understand why nobody uses JPEG 2000.
To make the confusion complete: I also used the 10-bit test from Eizo
https://www.eizo-apac.com/support-service/tech-library/general-monitor-support-and-faq/monitor-test and here I can see a slight difference.
So it seems that, at least with my Dell 8+2-bit, it does not work. Or only a little bit :)
 

Dutch60

macrumors regular
Original poster
May 18, 2019
221
80
No problem, Lothar29. I didn't give it a try yet.
Good that you did ;-)
I used those test images before.
I can clearly see a difference between those 16 and 8-bit files on my Eizo CG2700S (10-bit display).
 