The problem is more that you can't display correct color values at higher brightness for SDR content because clipping will occur.
You will just end up clipping highlights and shadows, leaving you with overexposed content, because the color profile doesn't hold the required data. There is nothing whiter than white; you would just push anything near white into being fully white, removing highlight detail in the process. The crushed blacks will simply turn greyish, expose noise and remove contrast. The same goes for tones of red, green and blue.
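For illustration, here is a minimal Python sketch of the clipping described above, assuming a naive 1.5x gain applied directly to 8-bit code values (the gain and the pixel values are made-up numbers, not from any real pipeline):

```python
import numpy as np

# Illustrative 8-bit code values: a highlight gradient just below white.
highlight = np.array([240, 245, 250, 252, 254, 255], dtype=np.uint8)

# A naive 1.5x "brightness boost" applied inside the 8-bit container.
boosted = np.clip(highlight.astype(np.float64) * 1.5, 0, 255).astype(np.uint8)

print(highlight)  # [240 245 250 252 254 255] -> six distinct steps of detail
print(boosted)    # [255 255 255 255 255 255] -> every step clipped to the same white
```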
Go read what nits and candelas are, how they are calculated, and then you won't make the mistake of comparing the measurements on a 16-inch screen versus a 6-inch screen.
Most content is encoded in sRGB or Rec. 709 with limited tonality: 8-bit and 256 shades.
This is simply not true. HDR content simply allows for a wider range of color values to be displayed all at once, but there is no stopping your white value from getting brighter.
Wanna see a whiter white?
Wanna see a whiter white? CSS trick/bug to display a brighter white by exploiting browsers' HDR capability and Apple's EDR system (kidi.ng)
^ if you are on a new MacBook, going to that page will show text that is white... and far brighter than the white that's currently on the display (even if the display is at max brightness). That's how you get white that is "whiter".
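Roughly what is going on, sketched in Python rather than CSS: assuming Apple's EDR convention where SDR reference white sits at 1.0 and the panel exposes some headroom above it, an extended-range value above 1.0 renders brighter than SDR white. The nit figures and headroom below are illustrative, not measured:

```python
# Illustrative numbers only, not measured values from any specific MacBook.
sdr_reference_white_nits = 500.0   # assume "100% brightness" SDR white lands here
edr_headroom = 2.0                 # assume the panel currently reports 2x headroom

def edr_value_to_nits(value: float) -> float:
    """Map an extended-range pixel value (1.0 = SDR reference white) to absolute nits."""
    clamped = min(value, edr_headroom)            # can't exceed the available headroom
    return clamped * sdr_reference_white_nits

print(edr_value_to_nits(1.0))  # 500.0 -> the brightest "normal" SDR white
print(edr_value_to_nits(1.8))  # 900.0 -> the page's HDR-boosted white, visibly whiter
```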
A nits measurement is still comparable between a 16-inch screen and a 6-inch screen. The nit as a unit means candelas per square metre; it is independent of display size.
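A quick worked example of the size independence: the luminance in nits stays the same while the light emitted toward the viewer scales with panel area. The screen dimensions below are rough approximations for a 16-inch laptop and a 6-inch phone:

```python
# Both panels driven to the same luminance of 1000 nits (cd/m^2).
luminance_nits = 1000.0

# Rough active-area dimensions in metres (illustrative figures).
laptop_area_m2 = 0.344 * 0.215   # ~16-inch laptop panel
phone_area_m2  = 0.150 * 0.070   # ~6-inch phone panel

# Luminous intensity toward the viewer (cd) = luminance (cd/m^2) * emitting area (m^2).
print(round(luminance_nits * laptop_area_m2, 1))  # ~74.0 cd
print(round(luminance_nits * phone_area_m2, 1))   # ~10.5 cd
# The totals differ, but both screens measure 1000 nits: the unit is per square metre.
```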
I didn’t realize monitor/display hoods still existed as I haven’t seen any since CRTs were common. An interesting evolution.
Perhaps time to visit the optometrist to ensure you don't have cataracts developing? 1000 nits is extremely bright when used indoors, unless you are sitting directly in front of a window looking at the sun.
There *is* a use case for having very bright displays when used outdoors, e.g. for camera screens. However, this is very much an edge-case for most laptop computer usage. If you really do need to use your laptop in very bright surroundings there are "sun-shades" to help improve visibility.
Do you wear sunglasses?
I personally prefer the maximum brightness output possible even in a dimly lit environment. I would use 1000 nits to surf the web.
Most content is encoded in sRGB or Rec. 709 with limited tonality: 8-bit and 256 shades.
I'm not arguing that white cannot be brighter. I'm saying you will lose detail in most content because the color profile doesn't retain information if pushed.
A nits measurement is still comparable between a 16-inch screen and a 6-inch screen. The nit as a unit means candelas per square metre; it is independent of display size.
And I'm saying you are mistaken. Color profile has nothing to do with display brightness. The fact is that you don't lose detail with your sRGB content going from 10% display brightness to 50% and then to 100%.
Making the display go from 500 nits to 1000 nits is just like making it go from 100% to, say, 200%. Your sRGB and Rec. 709 content will still map properly.
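A sketch of that argument, assuming the usual relative SDR model where the OS scales code values to whatever peak the panel is set to; the 2.2 gamma and the peak figures are simplifications, not a claim about any specific display pipeline:

```python
def srgb_code_to_nits(code: int, display_peak_nits: float, gamma: float = 2.2) -> float:
    """Map an 8-bit SDR code value to absolute nits on a display with the given peak."""
    relative = (code / 255.0) ** gamma   # simple gamma model, ignoring the sRGB piecewise toe
    return relative * display_peak_nits

for code in (32, 128, 250, 255):
    print(code,
          round(srgb_code_to_nits(code, 500.0), 1),    # panel set to 500 nits peak
          round(srgb_code_to_nits(code, 1000.0), 1))   # same panel pushed to 1000 nits peak
# Every code value simply doubles in absolute nits; 250 and 255 stay distinct,
# so nothing clips or crushes -- the relative tone curve is unchanged.
```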
Any ophthalmologist or optometrist will tell you that the eye is one of the most complex and most useful organs in the human body, the complexity of which is on par with the human brain. To be very carefully used, and certainly not abused.
I personally prefer the maximum brightness output possible even in a dimly lit environment. I would use 1000 nits to surf the web.
Presumably, this is similar to "snow blindness"?
When the retina's light-sensing cells become over-stimulated from looking at a bright light, they release massive amounts of signalling chemicals, injuring the back of the eye as a result.
Also the high contrast between a bright screen and dark surroundings may cause eyestrain or fatigue that could lead to a headache.
but...but....someone is wrong on the Internet! It's ....
Wow, some folks use their Macs in extreme ways. I just use it and get the job done. If it's not affecting 7 billion other people, I think you will live.
sRGB targets 80 nits.
Most colourists grade at 120 nits.
HDR videos don't brighten the full image to 1000+ nits. Ideally, only the areas of the image where it is needed are pushed that bright; otherwise the grade would differ too much across the various displays and cinema screens.
Candela per square metre - Wikipedia (en.wikipedia.org)
You sound like you just want to make things up as you go along.
The Dolby Vision format is capable of representing videos with a peak brightness up to 10,000 cd/m² and a color gamut up to Rec. 2020.
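For reference, a sketch of the SMPTE ST 2084 (PQ) transfer function that formats such as Dolby Vision build on. It maps a normalized signal to absolute luminance up to 10,000 cd/m², and it shows how SDR-ish diffuse white lands far below the 1000+ nit highlights mentioned above. The sample signal values are approximate:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal in [0, 1] -> absolute luminance in cd/m^2.
M1 = 1305 / 8192      # 0.1593017578125
M2 = 2523 / 32        # 78.84375
C1 = 107 / 128        # 0.8359375
C2 = 2413 / 128       # 18.8515625
C3 = 2392 / 128       # 18.6875

def pq_eotf(signal: float) -> float:
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for s in (0.508, 0.752, 1.0):
    print(s, round(pq_eotf(s), 1))
# ~0.508 -> ~100 cd/m^2 (around SDR diffuse white),
# ~0.752 -> ~1000 cd/m^2, 1.0 -> 10000 cd/m^2 peak.
```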
Such arrogant, patronizing comments really add nothing useful to the conversation.
Wow, some folks use their Macs in extreme ways. I just use it and get the job done. If it's not affecting 7 billion other people, I think you will live.
sRGB targets 80 nits.
While one would theoretically use the viewing conditions which represent the actual or typical viewing environment, if this is done with 24 bit images a significant loss in the quality of shadow detail results. This is due to encoding the typical viewing flare of approximately 5.0 percent into a 24 bit image as opposed to the encoding viewing flare of 1 percent. Therefore we recommend using the encoding viewing environment for most situations including when one's viewing environment is consistent with the typical viewing environment and not the encoding viewing environment.
The encoding ambient illuminance level is intended to be representative of a dim viewing environment. Note that the illuminance is at least an order of magnitude lower than average outdoor levels and approximately one-third of the typical ambient illuminance level.
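To make the quoted flare figures concrete, a rough illustration that models viewing flare as stray light equal to a percentage of the reference white luminance; this is a simplification of what the sRGB document describes, with the 80-nit white taken from the reference display condition:

```python
white_nits = 80.0  # sRGB reference display white luminance

def usable_contrast(flare_percent: float) -> float:
    """Approximate contrast ratio once flare lifts the effective black level."""
    black_floor_nits = white_nits * flare_percent / 100.0
    return white_nits / black_floor_nits

print(usable_contrast(1.0))  # 100.0 -> about 100:1 with the 1% encoding flare
print(usable_contrast(5.0))  # 20.0  -> about 20:1 with the 5% typical flare
# The higher flare compresses the bottom of the tone range, which is the
# shadow-detail loss the quoted text is warning about.
```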
Was Bruce Willis in any of them? If so, I have questions.
Why are you telling me what a nit means? Are you my video editing and colourist teacher from 1996 and every year since? In your lifetime you have watched movies, commercials and seen posters and images graded by me.
By the way, this is fundamentally misleading. Here's the sRGB reference viewing condition:
[Attachment: sRGB reference viewing conditions]
Source: https://www.w3.org/Graphics/Color/sRGB.html
Again, not everyone sits in a dark room to view sRGB images on their computers at 80 nits.
Was Bruce Willis in any of them? If so, I have questions.
Don't contest with nonsense because you are actually contesting thousands of professionals like me just so you can support some idiotic thread created by someone who thinks our displays should be set to 1000 nits minimum. What utter madness.
This is the third time someone has made a thread with the same suggestion and I am one of dozens of members who have said the same thing.
We grade for SDR and print between 80-120 nits. We cannot have a default or minimum 1000 nits.
We grade HDR a little higher but we use a preview monitor to visualise how it would look on different displays. We cannot have a default or minimum 1000 nits.
There's no contesting this. You're not a pro if you try to. Give up.