
leman

macrumors Core
Oct 14, 2008
19,516
19,664
Generally agree; it's just that unfortunately there is next to no choice of ~220ppi monitors, and while the choice at ~110ppi is much better, you can't find them at all sizes. For example, a 32" at ~110ppi would give a lot of screen real estate with pixel-perfect rendering at an affordable price, but nobody makes one.

I don't understand this obsession with pixel-perfect rendering. It was important when pixel density was low and pixel aliasing artefacts were obvious. Now we have affordable displays with pixels so small that they are indiscernible to human vision at usual viewing distances. At 110ppi desktop pixels are still visible to most people; at 130-140ppi they become practically indiscernible. Anything beyond that is a luxury bonus.

Any 27" or 32" 4K display currently sold at the market should work just fine with macOS. Sure, it will look like crap compared to a MacBook Pro Display, but that's just the difference in panel quality + color gamut + refresh rate.
 
Last edited:

pshufd

macrumors G4
Oct 24, 2013
10,145
14,572
New Hampshire
So in Apple's world, a 4K display should be 20" or 40".

That's why I chose a 27" 1440p: I'm older and my eyes aren't getting any better, and I can't find any pixel problems or unsharpness at this kind of resolution.

5K 27-inch Retina
5120 x 2880

2K 27-inch
2560 x 1440



The QHD displays on the 2009 and 2010 iMacs are sharp and clear, and I have a 25-inch QHD Dell that is sharp and clear; I just prefer a 27-inch. I generally recommend 27-inch QHD monitors to people who don't need to do really fine work, and the people I've recommended them to are quite happy with them.
 
  • Like
Reactions: Basic75 and genexx

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
I don't understand this obsession with pixel-perfect rendering.
I just want it to look decent. People complain about "winblows", but my cheap **** work laptop produces better results on my LG monitor than my overly expensive M1 MBP. If I spend $2,500 on a laptop that is promoted as a luxury brand, then it stands to reason that using a typical monitor should work fine. I don't get how people blame the monitor, or the fact that I chose to buy a $500 monitor instead of a $2,000 one when the $500 model works great for 3 out of my 4 computers.
 

tmoerel

Suspended
Jan 24, 2008
1,005
1,570
The QHD displays on the 2009 and 2010 iMacs are sharp and clear, and I have a 25-inch QHD Dell that is sharp and clear; I just prefer a 27-inch. I generally recommend 27-inch QHD monitors to people who don't need to do really fine work, and the people I've recommended them to are quite happy with them.
Yep I agree...but my 27" LG 4k screen looks sharp too. Even sharper than the QHD. So I don't understand where the problem is.
 

pshufd

macrumors G4
Oct 24, 2013
10,145
14,572
New Hampshire
Yep I agree...but my 27" LG 4k screen looks sharp too. Even sharper than the QHD. So I don't understand where the problem is.

4K monitors cost more, though. I have 2K, 4K and 5K monitors, and I have pretty good eyesight, so I use two of my 4K 27-inch monitors at native resolution. The problem can be more about how macOS interacts with the hardware; I get different results with scaled fonts between my M1 mini and my M1 Studio.

If you don't need more than 2k, then why buy 4k? Particularly if you can't tell the difference. There are some RAM and GPU costs to supporting higher resolution monitors as well.
 
  • Like
Reactions: genexx

leman

macrumors Core
Oct 14, 2008
19,516
19,664
I just want it to look decent. People complain about "winblows", but my cheap **** work laptop produces better results on my LG monitor than my overly expensive M1 MBP. If I spend $2,500 on a laptop that is promoted as a luxury brand, then it stands to reason that using a typical monitor should work fine. I don't get how people blame the monitor, or the fact that I chose to buy a $500 monitor instead of a $2,000 one when the $500 model works great for 3 out of my 4 computers.

MacOS wants at least a 4K display with a "usual" aspect ratio (e.g. 16:10 or 16:9) in the usual 21" to 32" size. These are the displays that have high enough pixel density to support Apple's chosen rendering method. If your display has lower pixel density than that, text will look like crap, since low-res displays really need the subpixel-AA hack to look decent.
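For concreteness, here's the arithmetic behind those density figures, as a rough sketch (the panel list is just illustrative, and Swift is only used here as a calculator):

```swift
import Foundation

// Pixels per inch for a 16:9-ish panel, from its resolution and diagonal size.
func ppi(horizontalPixels: Double, verticalPixels: Double, diagonalInches: Double) -> Double {
    let diagonalPixels = (horizontalPixels * horizontalPixels + verticalPixels * verticalPixels).squareRoot()
    return diagonalPixels / diagonalInches
}

let panels: [(name: String, w: Double, h: Double, inches: Double)] = [
    ("27\" QHD", 2560, 1440, 27),  // ~109 ppi, the classic low-DPI sweet spot
    ("27\" 4K",  3840, 2160, 27),  // ~163 ppi
    ("32\" 4K",  3840, 2160, 32),  // ~138 ppi
    ("27\" 5K",  5120, 2880, 27),  // ~218 ppi, Apple's Retina target
]

for p in panels {
    let density = ppi(horizontalPixels: p.w, verticalPixels: p.h, diagonalInches: p.inches)
    print("\(p.name): \(Int(density.rounded())) ppi")
}
```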

An entry-level 4K display with those specs costs around $300. I don't think that's an unreasonable price point if you are already spending top dollar on a computer. And of course better-quality displays will be more expensive and will produce a better image.

P.S. I have a 32" 4K AOC display at home and it works very well. I paid around $500 for it (in USD equivalent). I have a 5K LG Ultrafine in the office and obviously that one looks better, but it's not a night-and-day difference. Of course, your mileage may vary. Some people in this very thread say that they notice a big difference between 4K and 5K running the same virtual resolution. I don't.

P.P.S. There are also some displays that are improperly detected by Apple Silicon, but that's an entirely different issue.
 
  • Like
Reactions: Tagbert

Ethosik

Contributor
Oct 21, 2009
8,141
7,119
TL;DR

Go spend $2K+ on monitors, you peasants. Because font subpixel rendering (which was removed from macOS), which would make any monitor look crisp and clear like Windows ClearType, is apparently an unacceptable, trashy alternative to ultra-expensive, elitist-only Retina displays.
My $450 monitors look just fine, although I don't use HDMI; I use DisplayPort to USB-C.
 
  • Like
Reactions: maflynn

leman

macrumors Core
Oct 14, 2008
19,516
19,664
TL;DR

Go spend $2K+ on monitors, you peasants. Because font subpixel rendering (which was removed from macOS), which would make any monitor look crisp and clear like Windows ClearType, is apparently an unacceptable, trashy alternative to ultra-expensive, elitist-only Retina displays.

Subpixel rendering is an expensive, cumbersome and complicated workaround, which was invented only because the display technology was immature. This workaround is not needed with modern high-resolution displays. If you want to use old equipment and old tech, macOS is not the best choice; nobody in their right mind will deny that Windows has better backwards hardware compatibility. But to portray the Windows solution as superior just because it chooses to retain an overcomplicated and messy software stack to support cheap display panels is, well... an odd position to take.
 

pshufd

macrumors G4
Oct 24, 2013
10,145
14,572
New Hampshire
The state of monitors is somewhat of a mess, and it explains why people go to forums and ask what works. What I see is that you need to specify the computer, the port, the monitor and even the cable, or the results one person gets may not be the same as another's. That inconsistency leads to a lot of frustration, because nobody likes to buy stuff and return it when it doesn't work.
 
  • Like
Reactions: leman

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
MacOS wants at least a 4K display
I have a fully functioning ultrawide monitor. Why should I basically create e-waste? While you can find cheap 4K monitors, finding a good or decent ultrawide pushes the price up considerably, into the $1,000 to $2,000 range. All because macOS is unable to properly render on 1440p? It seems ludicrous that a high-end computer has difficulty driving a standard monitor.

btw, in googling I found this:
This M1 Mac display hack is a must for 1440p monitors

It's strange that people are so quick to say how bad PCs/Windows are (many times deservedly so), but if Apple is doing something poorly, well, it's not their fault. Seems crazy to me.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
I have a fully functioning ultrawide monitor. Why should I basically create e-waste?

An absolutely reasonable and understandable stance. Could Apple improve the situation for users like you? Undoubtedly. Is it worth the effort? Hardly. We are talking about quite a significant software stack complication just to support users who want to use hardware Apple doesn't really care about. I think their position of offering only minimal, bare-bones support in these cases can be justified as well.

It's strange that people are so quick to say how bad PCs/Windows are (many times deservedly so), but if Apple is doing something poorly, well, it's not their fault. Seems crazy to me.

There are not many technical decisions that do it all. Apple has always been about a curated experience, so they focus on a certain subset of configurations. Windows has traditionally focused on compatibility with as many configurations as possible; their solution trades away accuracy and takes on software complexity to achieve that flexibility.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
I have a fully functioning ultrawide monitor. Why should I basically create e-waste?
You created e-waste by buying a monitor without a computer to run it.
While you can find cheap 4K monitors, finding a good or decent ultrawide pushes the price up considerably, into the $1,000 to $2,000 range.
Don't buy what you can't afford. Simple! Also don't buy what's just cheap, but not good enough.
All because macOS is unable to properly render on 1440p?
Of course it can. The only problem with low-PPI displays is that in HiDPI mode the UI becomes very large.
It seems ludicrous that a high-end computer has difficulty driving a standard monitor.
A standard monitor is one Apple recommends by selling it on their own website. All ultrawides are non-standard trash intended for the PC gaming market. If there is such a thing as an industry-standard aspect ratio, it is 16:9.
What a gross hack. And all just to keep alive a resolution that Apple introduced with the Late 2009 27" iMac. I'd be damned if I bought the same resolution display again after 13 years. What's wrong with 2520p?
It's strange that people are so quick to say how bad PCs/Windows are (many times deservedly so), but if Apple is doing something poorly, well, it's not their fault. Seems crazy to me.
Because you still don't get that it's a physical limitation. For forward and backward compatibility of the entire software ecosystem, pixel doubling is the way to go when implementing a HiDPI user interface. And that means Retina displays also need to feature exactly twice as many physical pixels as before. macOS was never meant to run on low-res displays and doubling the number of pixels on a display that was already quite nice for 2009 pushes Apple far ahead of the competition. It's a luxury brand after all.
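To make the pixel-doubling point concrete, here's a small sketch of how it surfaces in AppKit (assuming a Mac with a display attached; NSScreen.backingScaleFactor and convertRectToBacking are the relevant APIs):

```swift
import AppKit

// Geometry in macOS is expressed in points; backingScaleFactor says how many
// physical pixels back one point (2.0 on Retina panels, 1.0 on low-DPI ones).
if let screen = NSScreen.main {
    let scale = screen.backingScaleFactor
    let onePoint = NSRect(x: 0, y: 0, width: 1, height: 1)
    let inPixels = screen.convertRectToBacking(onePoint)
    print("scale \(scale): 1pt x 1pt -> \(inPixels.width) x \(inPixels.height) physical pixels")
}
```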

(Damn Porsches, they don't drive as well with Toyota brakes!)
 
Last edited:

Kazgarth

macrumors 6502
Oct 18, 2020
318
834
Subpixel rendering is an expensive, cumbersome and complicated workaround, which was invented only because the display technology was immature. This workaround is not needed with modern high-resolution displays. If you want to use old equipment and old tech, macOS is not the best choice; nobody in their right mind will deny that Windows has better backwards hardware compatibility. But to portray the Windows solution as superior just because it chooses to retain an overcomplicated and messy software stack to support cheap display panels is, well... an odd position to take.

It's neither expensive nor complicated. It has existed since the Windows XP era and ran on legacy hardware with almost zero performance impact, let alone on modern CPUs and GPUs.

Plus, you've ignored the fact that it already existed on macOS, perfectly functional. It was only removed to make Retina displays stand out.

And no, it's not only cheap display panels that depend on subpixel rendering: there are expensive 1440p gaming monitors with refresh rates up to 240Hz (running games on them at 5K-6K resolution with current GPUs would be impossible), and the same goes for ultrawide monitors and most 4K displays.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
We are talking about quite a significant software stack complication just to support users who want to use hardware Apple doesn't really care about. I think their position of offering only minimal, bare-bones support in these cases can be justified as well.
Really? Having an OS handle different resolutions in a manner that provides crisp text is too complicated for Apple?
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
We are talking about quite a significant software stack complication just to support users who want to use hardware Apple doesn't really care about.
Subpixel antialiasing had been supported for 20 years. I'm not sure why Apple had to remove it when they were still selling and supporting Macs with non-retina displays.
How they announced the removal at a WWDC session was comical. It was along the lines of "we removed that unnecessary feature that makes things blurrier".
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Subpixel rendering is an expensive, cumbersome and complicated workaround, which was invented only because the display technology was immature.

Quote from an ex-MacOS SWE:
Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There's tons of ways to fall off of the subpixel antialiased quality path, and there's weird graphical artifacts when switching from static to animated text, or the other way. What a pain!
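To illustrate two of the pain points from that quote (the multiplied glyph caches and the need to know the background at render time), here is a toy sketch; the types and numbers are made up for illustration, not Apple's actual implementation:

```swift
import Foundation

// 1. The glyph cache key multiplies: the same glyph rasterized at a different
//    fractional x-offset has different subpixel coverage, so each offset needs
//    its own cache entry (hypothetical key type for illustration).
struct GlyphCacheKey: Hashable {
    let glyphID: UInt16
    let pointSize: Int
    let subpixelOffsetQuarter: Int   // 0...3 -> four times as many cache entries
}

// 2. The blend needs foreground AND background at render time, per channel:
//    out = bg + coverage * (fg - bg), with separate coverage for R, G and B.
func subpixelBlend(fg: (Double, Double, Double),
                   bg: (Double, Double, Double),
                   coverage: (r: Double, g: Double, b: Double)) -> (Double, Double, Double) {
    return (bg.0 + coverage.r * (fg.0 - bg.0),
            bg.1 + coverage.g * (fg.1 - bg.1),
            bg.2 + coverage.b * (fg.2 - bg.2))
}

// Black text over a 50% grey background, with unequal per-channel coverage:
print(subpixelBlend(fg: (0, 0, 0), bg: (0.5, 0.5, 0.5), coverage: (r: 1.0, g: 0.6, b: 0.2)))
// -> (0.0, 0.2, 0.4): each channel gets its own value. If the assumed background
//    turns out to be wrong at composite time, these per-channel offsets show up
//    as colour fringes.
```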
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Really? Having an OS handle different resolutions in a manner that provides crisp text is too complicated for Apple?
It's not too complicated for Apple, it's physically impossible. For every logical pixel you want to draw there must be a physical pixel on the display. Or two pixels, or three pixels, but always a whole-number multiplier. Only then can you switch entire pixels on and off. But if there are only 60% more pixels instead of twice as many, then one logical pixel is represented by 1.6 physical pixels and some grey tone must mark the border between black and white areas. Maybe you can't see it, but the fuzziness is definitely there, and it is more apparent when you scroll. This problem is not solvable in software, only in hardware.
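A back-of-the-envelope sketch of that 1.6-pixels-per-logical-pixel point: how much of each physical pixel a one-logical-pixel-wide black line covers at an integer scale versus a fractional one (illustrative numbers only).

```swift
import Foundation

// Physical-pixel coverage of a line that is exactly one logical pixel wide.
func coverage(scale: Double, logicalStart: Double) -> [Double] {
    let start = logicalStart * scale
    let end = (logicalStart + 1) * scale
    let firstPixel = Int(start.rounded(.down))
    let lastPixel = Int(end.rounded(.up)) - 1
    return (firstPixel...lastPixel).map { px in
        let pxStart = Double(px)
        let pxEnd = pxStart + 1
        return max(0, min(end, pxEnd) - max(start, pxStart))  // covered fraction of this pixel
    }
}

print(coverage(scale: 2.0, logicalStart: 3))
// [1.0, 1.0] -> two fully lit pixels, a clean edge
print(coverage(scale: 1.6, logicalStart: 3).map { ($0 * 100).rounded() / 100 })
// [0.2, 1.0, 0.4] -> a grey sliver, one solid pixel, another grey sliver
```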

This whole debate reminds me of the 737 MAX debacle. The bigger engines don't fit under the wings anymore, but we really don't want to pay for a taller landing gear either. So let's just write a piece of software that pretends the thrust is still coming from under the wings even though the engines are now in front of them. What could go wrong? Software can do everything, right? Even make broccoli soup taste like apple pie. And if it doesn't work, we blame the stupid programmers. Was it too complicated for you? I said apple pie, not plane crash!

PS: While we're at it, why ever buy a new display? Just demand that Apple write an OS that makes the Bondi-blue iMac G3 look like a Pro Display XDR. It can't be too complicated. Missing pixels are no excuse!
 
Last edited:
  • Like
  • Haha
Reactions: genexx and rmadsen3

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
Subpixel rendering is an expensive, cumbersome and complicated workaround, which was invented only because the display technology was immature. This workaround is not needed with modern high-resolution displays. If you want to use old equipment and old tech, macOS is not the best choice; nobody in their right mind will deny that Windows has better backwards hardware compatibility. But to portray the Windows solution as superior just because it chooses to retain an overcomplicated and messy software stack to support cheap display panels is, well... an odd position to take.
I think what you wrote gets to the essence of the disagreement.

We agree on this: 4k 27" displays are now commodities, which makes them reasonably economically accessible (~$500) for most Mac users wanting an external display.

Here's where we fundamentally disagree:
Your position (IIUC): 4k 27" (163 ppi) is sufficiently dense to provide optimally sharp text without subpixel AA.
My position: 4k 27" (163 ppi) is not sufficiently dense to provide optimally sharp text without subpixel AA.

So while you're saying that the elimination of subpixel AA means Apple has merely abandoned support* for cheap low-DPI displays (quite reasonable), I'm saying it means they've abandoned support for anything other than Retina displays.

Hence, to get that "curated" (to use your term) viewing experience of optimum text sharpness, I find macOS now requires a Retina display, which puts large externals (>= 27") economically out of reach of many Mac customers, since that requires an ASD or LG Ultrafine.

*By "abandoned support" I don't mean they don't work, I mean they don't give an optimum viewing experience.
 
Last edited:

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
By the word "suck" in the name of this thread, we can conclude that Macs do not provide optimally sharp text on 163ppi displays. The question is: who's to blame? The display maker/buyer, for choosing a pixel density right in the middle of the "bad zone"? Or Apple, for not completely eliminating the effect in software? Anti-aliasing, whether at the pixel or sub-pixel level, is a blurring effect. It can only make blocky text look smoother, not sharper.

Every cop show knows, you need to: "Zoom in ... Enhance!"

 
  • Like
Reactions: PauloSera

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Subpixel antialiasing had been supported for 20 years. I'm not sure why Apple had to remove it when they were still selling and supporting Macs with non-retina displays.

One word: transparency. They wouldn't have had to remove it if they hadn't brought so much transparency into Mojave when they added dark mode; but faced with a choice between transparency without color artifacts and no transparency at all, they chose the former.

Subpixel antialiasing works because you know what's underneath when you draw the character, and you can do the blending right then and there. Once the character is drawn, the context is effectively lost. However, with the advent of Core Animation and GPU-accelerated drawing, something like a text field may no longer sit on a solid background, especially if that background is a "material" that blends with the other content on the desktop (such as the translucent backgrounds added in Mojave). Then it becomes impossible to know how to do the subpixel blend when drawing the character, as it is only known at final compositing time, where you no longer have the context required to blend it properly.

One project I worked on had to selectively disable the subpixel antialiasing depending on exactly what was happening with text. If we detected that the background was solid fill, we enabled subpixel AA. But if the background was clear/transparent, which meant a clear background CALayer, we had to disable it, or it'd look even worse with color fringing where the subpixel AA assumed the wrong background color for the pixel. Something I also encountered playing with Core Animation when it first came out.
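In rough terms, the switch looked something like the sketch below (not the actual project code; the backgroundIsOpaque flag is a stand-in for however you detect the backdrop, and setShouldSmoothFonts is the CGContext toggle that gated subpixel smoothing before Mojave removed it):

```swift
import AppKit

// Only request font smoothing when the backdrop is known and opaque;
// otherwise fall back to plain greyscale antialiasing to avoid colour fringes.
final class LabelView: NSView {
    var text = "Hello" as NSString
    var backgroundIsOpaque = true   // stand-in flag; false when sitting on a translucent material

    override func draw(_ dirtyRect: NSRect) {
        guard let ctx = NSGraphicsContext.current?.cgContext else { return }

        if backgroundIsOpaque {
            NSColor.windowBackgroundColor.setFill()
            dirtyRect.fill()                      // solid, known background
            ctx.setShouldSmoothFonts(true)        // the subpixel blend has something to blend against
        } else {
            ctx.setShouldSmoothFonts(false)       // unknown backdrop: no subpixel smoothing
        }

        text.draw(at: .zero, withAttributes: [
            .font: NSFont.systemFont(ofSize: 13),
            .foregroundColor: NSColor.labelColor
        ])
    }
}
```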

This is the same issue Apple was facing. In Mojave, if you enabled subpixel AA and turned dark mode on, you could see the color fringing from the text. In light mode it wasn't really as apparent since the AA was assuming a white background which is generally "close enough" even with some light translucency.

By the word "suck" in the name of this thread, we can conclude that Macs do not provide optimally sharp text on 163ppi displays. The question is who's to blame? The display maker/buyer for choosing a pixel density right in the middle of the "bad zone". Or Apple for not completely eliminating the effect by software? Anti-Aliasing whether on pixel or sub-pixel level is a blurring effect. It can only make blocky text look smoother not sharper.

Generally, Windows can do sharper text at the same "scale factor" compared to macOS. But a good chunk of that is that Windows achieved resolution independence (in an opt-in way for apps, and they look pretty crusty if they don't opt in and handle it themselves), while Apple did not. Apple instead has to downscale any non-integer "scale factor" from the next integer up, which adds a full-screen AA effect that cannot do the sort of things that font AA (subpixel or not) can do. And it affects more than just text as a result.
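That downscaling path can be put in numbers; here's a quick sketch of the popular "looks like 2560x1440" mode on a 27" 4K panel (the 2x backing store is the behaviour described above, the helper itself is just illustrative):

```swift
import Foundation

// Non-integer scaling on macOS, in numbers: the "looks like" size is rendered
// at 2x into a backing store, then the whole frame is resampled down to the panel.
func scaledMode(looksLikeW: Int, looksLikeH: Int, panelW: Int, panelH: Int) -> String {
    let backingW = looksLikeW * 2
    let backingH = looksLikeH * 2
    let downscale = Double(backingW) / Double(panelW)
    return "render \(backingW)x\(backingH), downscale by \(downscale)x to \(panelW)x\(panelH)"
}

print(scaledMode(looksLikeW: 2560, looksLikeH: 1440, panelW: 3840, panelH: 2160))
// -> render 5120x2880, downscale by ~1.33x to 3840x2160
```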

Honestly, it's the biggest engineering shortcut Apple has made, and it has bitten them in the butt as Apple went in an entirely different direction than the rest of the display market. That said, it made it a lot faster for developers to hop on board.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Here's where we fundamentally disagree:
Your position (IIUC): 4k 27" (163 ppi) is sufficiently dense to provide optimally sharp text without subpixel AA.
My position: 4k 27" (163 ppi) is not sufficiently dense to provide optimally sharp text without subpixel AA.

I have to admit that I never understood this claim. At a typical desktop viewing distance of 80-100cm, a 32" 16:9 4K has angular pixel density comparable to the original 15" Retina at a typical laptop viewing distance of 50-60cm. I don't sit that close to my desk monitor, and I don't have any problem with text aliasing on my 32" 4K monitor. I understand that someone who sits closer is likely to notice artifacting, but why get a large display and then stick your nose in it?
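Here's that comparison worked out in pixels per degree of visual angle (my own rough numbers, using the distances mentioned above and approximate panel densities):

```swift
import Foundation

// Pixels per degree of visual angle, approximately ppi * distance * tan(1 degree).
func pixelsPerDegree(ppi: Double, viewingDistanceCM: Double) -> Double {
    let distanceInches = viewingDistanceCM / 2.54
    return ppi * distanceInches * tan(Double.pi / 180)
}

print(pixelsPerDegree(ppi: 138, viewingDistanceCM: 90))  // 32" 4K at ~90cm   -> ~85 px/deg
print(pixelsPerDegree(ppi: 220, viewingDistanceCM: 55))  // 15" Retina at ~55cm -> ~83 px/deg
```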

Subpixel antialiasing had been supported for 20 years. I'm not sure why Apple had to remove it when they were still selling and supporting Macs with non-retina displays.
How they announced the removal at a WWDC session was comical. It was along the lines of "we removed that unnecessary feature that makes things blurrier".

Because it simplifies the software rendering stack. With subpixel AA, text is the "special case": you have to tiptoe around it and choose your rendering method depending on the context. In particular, you need to take into account the background color and employ custom algorithms for tweaking the channel luminance. This gets especially messy with compositing desktop managers (you need to know which other layers your text layer will be composited with). And of course, subpixel AA depends on the pixel grid orientation.

Why maintain all this mess if you simply don't need the feature on modern hardware?

It's neither expensive nor complicated. It has existed since the Windows XP era and ran on legacy hardware with almost zero performance impact, let alone on modern CPUs and GPUs.

And before that we had hardware T&L, accelerated 2D blitters, bitmap fonts, pixel palettes and banked RGB planes... should we keep using those hacks as well? Technology moves forward and makes old hacks obsolete.
 
  • Like
Reactions: Tagbert

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Generally, Windows can do sharper text at the same "scale factor" compared to macOS.

Isn't one of the main reasons for "sharp text" on Windows that it doesn't respect the font metrics and instead snaps the shapes to the pixel grid? Or, to put it differently, that it doesn't render the fonts correctly in order to make the letters appear sharper? After being spoiled by macOS's accurate text rendering, I find Windows fonts absolutely horrendous.
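The trade-off in miniature, with made-up glyph advances rather than real font data: grid fitting rounds every advance to whole pixels for crisp stems, and the rounding error accumulates away from the designed metrics.

```swift
import Foundation

// Hypothetical glyph advances, in pixels, for a short run of text.
let designedAdvances = [6.4, 5.2, 6.4, 3.1, 6.4]

var gridFitWidth = 0.0   // grid-fitted: snap each advance to the pixel grid
var metricWidth = 0.0    // metric-accurate: keep the designed fractional advances
for advance in designedAdvances {
    gridFitWidth += advance.rounded()
    metricWidth += advance
}
print(String(format: "grid-fit: %.1f px, designed metrics: %.1f px", gridFitWidth, metricWidth))
// -> grid-fit: 26.0 px, designed metrics: 27.5 px
```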
 
  • Wow
Reactions: Gudi

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
After being spoiled by macOS's accurate text rendering, I find Windows fonts absolutely horrendous.
That's the funny thing: I'm also absolutely spoiled by macOS's accurate text rendering when using my MBP, until it's hooked up to my monitor. You may find Windows font rendering horrendous, but for many people like myself, it's the polar opposite.

I can be black and white at times, and here's where that mentality fits. It is really a binary discussion: it either looks better or it doesn't. We don't need to go into sub-pixel anti-aliasing discussions, and there's no need to say that Windows cheats or does things against convention. It's really about what looks better.

Does my cheap Dell work laptop look great on my monitor? Yes.
Does my expensive MBP look great on my monitor? No.

Based on the answers above, who is doing a better job with text rendering on monitors? PCs/Windows. (If it were Apple, this thread and many other threads, articles, and discussions on this topic wouldn't exist.)

The proof is in the pudding: 1080p and 1440p are still mainstream, and saying it was too complicated for Apple so they removed the feature seems insulting to Apple's ability to make amazing products.
 
Last edited:

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
The proof is in the pudding: 1080p and 1440p are still mainstream, and saying it was too complicated for Apple so they removed the feature seems insulting to Apple's ability to make amazing products.
Toyota Corolla brakes are mainstream, and with them the Porsche loses all its handling. Are we talking about the best car, the cheapest car, or the most common one? Because those are three different things.

Eat **** — fifty billion flies can't be wrong.
 
Last edited by a moderator: