Burn-in is a function of brightness intensity and time. For typical OLED TV use, it's usually not a huge issue because the content is dynamic and a given area of the screen will not remain very bright constantly. The main exception is a channel logo in a corner, which modern OLEDs are programmed to identify and darken, and which typically moves as you change channels.

For desktop use, the situation is much different. Typical desktop UI and computer applications UI are much more static than TV content and have plenty of static elements which will be displayed for a long time at exactly the same place. Burn-in will happen much more quickly under those conditions than by watching varied content on TV.
Both my LG UltraFine 5K panels - IPS panels - have significant image retention/burn-in around the edges. Practically, how much worse can OLED burn-in be?
 
I must admit I'm concerned about OLED on my Mac where the screen is left on for 7-8 hours a day while working.

It's acceptable on my iPhone and Watch because I just don't interact with them that much or leave them idle on a static screen. That's not the case with my computers.

I purchased an OLED desktop monitor (an Asus PG42UQ) last year, and after a full day of work it brought up a warning dialog saying it needed to perform some kind of pixel refresh to maintain the display and prevent burn-in.

I don't expect Apple to bring up such dialogs, but I do expect the computer to dim when it thinks you're no longer looking at the screen, like the iPhone does, and I don't think that's acceptable.

I would much prefer they continue with MiniLED-based FALD backlighting, since there's no risk of burn-in. They just need to vastly increase the zone count from the current 2,500 zones with 10,000 LEDs to maybe 10,000 zones with 40,000 LEDs.
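As rough arithmetic only, here's what those zone counts work out to per zone. The 42-inch 16:9 panel size is an assumption for illustration (matching the panel size discussed elsewhere in this thread), not anything Apple has announced:

```python
import math

# Rough arithmetic: dimming-zone density for the zone counts mentioned above,
# on a hypothetical 42-inch 16:9 panel (panel size is an assumption).
diag = 42.0
w = diag * 16 / math.hypot(16, 9)   # panel width, ~36.6 in
h = diag * 9 / math.hypot(16, 9)    # panel height, ~20.6 in
area = w * h                        # ~754 sq in

for zones, leds in [(2_500, 10_000), (10_000, 40_000)]:
    pitch = math.sqrt(area / zones)  # approximate zone pitch, inches
    print(f"{zones:>6} zones / {leds:>6} LEDs -> ~{pitch:.2f} in per zone")
```

Quadrupling the zone count halves the zone pitch, which is what would shrink blooming/haloing around small bright objects.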

You should use screensavers and have them kick in after 5 minutes of no use. Also consider power-saving features that turn off or sleep the monitor after a set number of minutes.
 
Until burn-in is solved, OLED is not good for monitor usage. I really don't understand why people crave OLED for their monitors. Are you all locked in dark rooms such that you need OLEDs?

It would be good to have a 32" iMac with OLED. Hopefully Apple will not price it too high because of the OLED display.
 
Is a regular-ratio 42" going to be that useful sitting <1 m away from your eyes on a desk? It's going to be quite a strain on the neck. Surely an ultrawide 42" would be better.
 
Both my LG UltraFine 5K panels - IPS panels - have significant image retention/burn-in around the edges. Practically, how much worse can OLED burn-in be?

Image retention and burn-in are typically used to describe two different issues. An IPS panel can have image retention, which is temporary, but should not have burn-in, which is permanent damage. This means you can usually restore an IPS showing image retention since the underlying pixels should not be permanently damaged.

OLEDs, on the other hand, generally degrade the brighter and the longer they remain lit. If part of the panel constantly shows the same bright image while the rest of the panel shows more dynamic or darker content, the OLEDs composing the affected pixels will degrade faster than the rest of the panel, and that area becomes visibly damaged over time. This kind of damage is permanent.
 
Crazy. Four years is a long time especially given how long OLEDs have been out for already. Apple really doesn't do anything in a timely manner.
 
I really don't understand why people crave OLED for their monitors. Are you all locked in dark rooms such that you need OLEDs?

I presume it is for the strengths of OLED, which include fantastic HDR thanks to pure blacks (so no "grayish blacks" like FA-LED) and no ghosting/haloing around bright objects (like MiniLED).

I have had an OLED TV since 2017 and would never buy anything else, and I watch it in ambient outdoor room light with no real issues in terms of brightness.

OLED would add another $1k to the price.

The Bill of Materials would not justify such an increase over LED, but Apple's margins could. :p
 
I get that for a TV, as we usually watch movies in the dark, but for a daily computer monitor it's kind of lost. Anyone who works on a monitor in pitch black is ruining their eyesight.

I think OLED is still very flawed tech for monitors. Permanent burn-in is just an issue that none of us want to deal with.

MicroLED is probably 7-10 years away from monitor screens. I think better implemented MiniLED is the way to go for now.


I presume it is for the strengths of OLED, which include fantastic HDR thanks to pure blacks (so no "grayish blacks" like FA-LED) and no ghosting/haloing around bright objects (like MiniLED).

I have had an OLED TV since 2017 and would never buy anything else, and I watch it in ambient outdoor room light with no real issues in terms of brightness.



The Bill of Materials would not justify such an increase over LED, but Apple's margins could. :p
 
I also don't think OLED is a very good idea for a monitor display, but we're presuming that such a display would use the consumer WOLED or QD-OLED found in TVs, because that would be easiest. However, Apple could use AMOLED technology similar to what it uses in the iPhone and Apple Watch, which has a longer usable life and better resistance to burn-in.
 
However, Apple could use AMOLED technology similar to what it uses in the iPhone and Apple Watch, which has a longer usable life and better resistance to burn-in.

I don't think AMOLEDs are more resilient to burn-in per se. They can generally last longer because the display is often off or often operating at relatively low luminance, but that would be true for TVs too. In fact, one of the main strategies to prevent burn-in is to dim either the whole display or the parts of it identified as showing static bright images, like logos.
 
A 42" Retina display would be 8K …

Or in layman’s terms: 🤤🤤🤤🤤🤤🤤🤤🤤
What kind of bandwidth would one need to run a display at 8K 120 Hz HDR 4:4:4? I don’t think anything exists at this moment that can do it, but would we need 80 Gbps, or more like a 160 Gbps connection?
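The back-of-the-envelope math is straightforward, assuming 10 bits per channel (30 bpp) for HDR and ignoring blanking and protocol overhead, which add somewhat more in practice:

```python
# Back-of-the-envelope bandwidth for 8K 120 Hz HDR 4:4:4.
# Assumes 10 bits per channel (30 bpp); real links also carry blanking
# and protocol overhead, so actual requirements are somewhat higher.
width, height, refresh_hz, bits_per_pixel = 7680, 4320, 120, 30
raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{raw_gbps:.0f} Gbps uncompressed")
```

That works out to roughly 119 Gbps of raw pixel data, which is above the 80 Gbps maximum link rate of DisplayPort 2.1 (UHBR20). So running it uncompressed would indeed need something closer to a 160 Gbps-class connection, or Display Stream Compression at today's link rates.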
 
By 2027 the human race will be nestled into their cocoons, serving as batteries for AI. A fancy new screen won't matter none. ;)
 
Well I’d prefer my entertainment screen in my cocoon to be OLED since I’ll be spending a lot of time in there.
Just get the Apple VR goggles; they should be gen 4 by then, so probably better battery life, a faster processor & GPU, etc.
 
By 2027, there will be relatively cheap MicroLED TV panels available, which are much better suited for a desktop monitor for obvious reasons. Hopefully this is just another stupid rumor straight from the **s.
 
A 32" monitor with a MacBook Pro would win out over the possibility of a new larger iMac for me. I know the cost will likely be quite a bit more, but the portability options just make more sense for me. Would love to see this rumor come true.
 