
theorist9

macrumors 68040
May 28, 2015
3,710
2,812
It's a question of how resolution independence is achieved. In the traditional rendering model there are just pixels — you get your drawing surface, it has certain pixel dimensions, you want to draw a 1-pixel-thick line, and so you do it. If you want resolution independence, things get more complicated, as you need to distinguish between logical coordinates/sizes (how "large" stuff looks) and the actual size in physical (hardware) pixels on screen. One obvious approach is to say that a single "logical" pixel corresponds to "x" hardware pixels, where x is some real number. That's essentially what Windows does. Apple was actually one of the first to experiment with this technology; they had a working implementation of fractional backing scale factors somewhere around Lion, I forget. The problem, though, is that this kind of fractional mapping makes rendering complicated. Applications that do their own rendering in particular break and might need to have parts of them redesigned. There are also non-trivial performance implications. It's simple enough to code a line-drawing algorithm that draws a line of fixed thickness. It's not that simple to code one that draws a line of arbitrary thickness (which you need in order to support arbitrary backing scale factors).
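To make that concrete, here's a toy sketch (made-up names, not any real macOS API) of why a fractional mapping complicates something that used to be trivial:

```swift
// Toy sketch, not a real API: mapping logical sizes to hardware pixels.
struct Surface {
    let scale: Double   // backing scale factor: 1.0, 1.5, 2.0 ...

    // A "1 point" hairline expressed in hardware pixels.
    func hairlineThicknessInPixels() -> Double {
        return 1.0 * scale
    }
}

let integer    = Surface(scale: 2.0)
let fractional = Surface(scale: 1.5)
print(integer.hairlineThicknessInPixels())    // 2.0 px -> maps cleanly onto the pixel grid
print(fractional.hairlineThicknessInPixels()) // 1.5 px -> no such thing in hardware, so the
                                              // renderer must anti-alias or snap the line
```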

So Apple took a different approach. They fix their backing scale factors to be integers (2x or 3x currently), which simplifies algorithms and the software transition, while still allowing them to emulate arbitrary scaling (via downsampling). This decision was the main reason why most Mac software could transition to retina graphics as quickly as it did. This approach also results in superb rendering quality most of the time, since you are essentially using super-sampling AA for scaled resolutions. There are drawbacks of course, memory usage for render surfaces and memory bandwidth being the main concerns. As to performance, I don't think anyone has done detailed measurements. It is true that rendering to a higher-resolution target is generally slower, but it's not obvious that it's slower than implementing backing-scale-factor-agnostic rendering algorithms everywhere (in practice it might even end up being faster). The performance overhead of downsampling tends to be massively exaggerated: GPUs are very good at linear filtering, those resolutions and frame rates are not even close to being a challenge, and you only need to process the small areas of the screen that have changed between two frames. Bandwidth concerns are real though, which is what we see with the M1 and multiple high-res displays trying to do GPU-intensive work. But then again, it's not like any comparable GPU would fare any better in a similar scenario, retina rendering or not...
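Rough numbers for what a "looks like 2560x1440" mode on a 4K panel means under this scheme (my arithmetic, not Apple's internals):

```swift
// Sketch of the integer-scale-then-downsample scheme (illustrative numbers only).
let logical = (w: 2560, h: 1440)                     // the "looks like" size the user picks
let backing = (w: logical.w * 2, h: logical.h * 2)   // 5120x2880: everything renders at a clean 2x
let panel   = (w: 3840, h: 2160)                     // native pixels of a 27" 4K display

// One GPU-filtered resample maps the oversized backing store onto the panel.
let downsampleFactor = Double(panel.w) / Double(backing.w)   // 0.75
print(backing, downsampleFactor)
```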
Thanks for the detailed explanation, but I'm afraid it doesn't directly answer either of my questions. When you don't have the app-specific issues you described, does Windows' resolution-independence mean it avoids the sharpness loss Apple has with non-integer scaling? And can you change the UI size with Windows independent of the scaling factor?
 

leman

macrumors Core
Oct 14, 2008
19,302
19,285
Thanks for the detailed explanation, but I'm afraid it doesn't directly answer either of my questions. When you don't have the app-specific issues you described, does Windows' resolution-independence mean it avoids the sharpness loss Apple has with non-integer scaling? And can you change the UI size with Windows independent of the scaling factor?

I don’t have intimate knowledge about Windows UI subsystems, so I’m afraid I can’t answer this. Just a few thoughts for your consideration:

- rendering without downsampling can always be made to appear “sharper”, since you can “snap” the visual feature boundary to physical pixels, but if you do this you are no longer producing mathematically correct rendering. Windows in particular was known to use this in the past to enhance the sharpness of the fonts, no idea whether it still does. But it was a major reason why many designers preferred the Mac - for the ability to render fonts how they were actually supposed to look (a rough sketch of the snapping idea follows after this list)

- neither approach has any inherent “sharpness loss” to it. For vector graphics it doesn't matter whether you render at 2x2 and then downsample to 1.5x1.5 or render directly at 1.5x1.5, as long as you do your rendering correctly. Rendering directly can appear sharper if you are ok with not rendering correctly (see above). For raster (pixel) data both approaches have the same problem: they have to resample the image anyway. In practice, Apple's approach will always downsample a higher-res image asset, which offers better quality than upsampling a lower-res one. But there is nothing preventing Windows from doing the same, provided they have the appropriate infrastructure for asset selection in place.

- settings to change the UI size rarely work, since they can break the size invariants designers work with. For this to work you need some sort of adaptive API that lets the application choose different designs at different scales. It's been a while since I looked at modern UI programming for Windows, but I wouldn't be surprised if they offer adaptive APIs
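Here is the snapping idea from the first bullet as a toy calculation (illustrative only, not any real rendering API):

```swift
// A 1-point rule at a fractional scale factor: rendered "correctly" it falls between
// hardware pixels and gets anti-aliased (softer); snapped, it is crisp but no longer
// exactly where the layout says it should be.
let scale = 1.5
let logicalY = 10.25                                 // position of the line, in points
let exactPixelY   = logicalY * scale                 // 15.375 px -> blended across two rows
let snappedPixelY = (logicalY * scale).rounded()     // 15.0 px   -> sharp, but shifted
print(exactPixelY, snappedPixelY)
```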
 

theluggage

macrumors 604
Jul 29, 2011
7,548
7,469
does Windows' resolution-independence mean it avoids the sharpness loss Apple has with non-integer scaling?
For any graphics rendered at run time by the OS or well-behaved applications, yes. Anything pre-rendered (icons, images etc.) will still have to be non-integer scaled, and anything rendered by ill-behaved software (that doesn't call the correct API to translate between logical coordinates and pixels) could get degraded or broken. E.g. it's common to render to an off-screen buffer and then plot that - well-behaved applications will call the OS to create a buffer and graphics context that matches the screen, while lazy/cross-platform apps may have that resolution hard-coded in.
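A simplified AppKit sketch of the "well-behaved" path (my own helper name, error handling omitted); the point is simply to ask the window for its backing scale factor rather than assuming one:

```swift
import AppKit

// Hedged sketch: size an off-screen buffer from the window's actual backing scale
// factor instead of hard-coding a 1:1 (or 2:1) mapping.
func makeOffscreenBuffer(for view: NSView) -> CGContext? {
    let scale = view.window?.backingScaleFactor ?? 1.0          // 1.0, 2.0, ... per display
    let pixelWidth  = Int(view.bounds.width * scale)
    let pixelHeight = Int(view.bounds.height * scale)

    let ctx = CGContext(data: nil,
                        width: pixelWidth,
                        height: pixelHeight,
                        bitsPerComponent: 8,
                        bytesPerRow: 0,
                        space: CGColorSpaceCreateDeviceRGB(),
                        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    ctx?.scaleBy(x: scale, y: scale)   // drawing code keeps working in logical points
    return ctx
}
```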

Overall, I'd say that the Windows solution is theoretically superior, but the Mac solution is more robust, esp. when it comes to multiple, mixed-resolution screens or hot-plugging external displays on a laptop. Windows has had the APIs to let you write resolution-independent code & let the user choose the scale factor/PPI since forever - but in the past, changing the PPI mostly broke stuff. It's been a few years since I regularly used a Windows laptop - which had a "retina" screen and worked well at 150% scale but tended to screw up when you docked it to a standard-def external display.

- rendering without downsampling can always be made to appear “sharper”, since you can “snap” the visual feature boundary to physical pixels, but if you do this you are no longer producing mathematically correct rendering. Windows in particular was known to use this in the past to enhance the sharpness of the fonts, no idea whether it still does.
There's nothing "incorrect" about this. "Outline" font formats describe the fonts using resolution-independent mathematical curves - and usually include "hinting" rules designed by the typographer to adjust the exact shape at render-time to maintain legibility at low resolutions, stop them getting too 'black' at large sizes etc. I think the concept was invented by Donald Knuth with Metafont - but it was really Apple and Adobe who popularised it (along with pretty much creating the desktop publishing industry) and created the TrueType format (Adobe and MS later expanded it into the current OpenType format). I think earlier versions of Windows (and pre-TrueType Macs) relied on pre-rendered bitmap "screen" fonts to approximate the true outline fonts that could be produced by PostScript printers/typesetters.
 

leman

macrumors Core
Oct 14, 2008
19,302
19,285
There's nothing "incorrect" about this. "Outline" font formats describe the fonts using resolution-independent mathematical curves - and usually include "hinting" rules designed by the typographer to adjust the exact shape at render-time to maintain legibility at low resolutions, stop them getting too 'black' at large sizes etc. I think the concept was invented by Donald Knuth with Metafont - but it was really Apple and Adobe who popularised it (along with pretty much creating the desktop publishing industry) and created the TrueType format (Adobe and MS later expanded it into the current OpenType format). I think earlier versions of Windows (and pre-TrueType Macs) relied on pre-rendered bitmap "screen" fonts to approximate the true outline fonts that could be produced by PostScript printers/typesetters.

I was not talking about font rendering exclusively, but about all kinds of vector rendering where pixel snapping can be used. Sure, some fonts are designed with pixel snapping in mind. But it hardly makes it better IMO? Consider this (old but still very interesting) article:

https://damieng.com/blog/2007/06/13/font-rendering-philosophies-of-windows-and-mac-os-x/

If what we see there is just a correct application of font hinting, fine, but it still makes fonts behave inconsistently at different sizes. The screenshots in the linked article should illustrate how bad it used to be. I mean, even if these hints are provided by the font author, it's still pretty much awful. So what if it's sharper, it's also uglier. Just look at the outlines of "S" and "p" on both screenshots, there is a notable difference — the glyphs on the right are elegant and smooth, with ink-like quality, while the left ones are a sad mess.

As I said, I don't know if ClearType is still this aggressive about manipulating glyphs. From what I understand, Apple ignores hinting altogether in order to give text a consistent, print-like appearance, even if it can introduce additional blurriness at lower DPI.
 

theluggage

macrumors 604
Jul 29, 2011
7,548
7,469
Well, I will concede that it's too complicated for Apple to justify the continued support.
I think that's the crux of it. Sub-pixel anti-aliasing is fragile. Apart from the issues with transparency & not working with fractional scaling already noted, it relies on the OS (a) knowing the physical layout of the RGB subpixels and (b) being able to control them individually without interference. It could be broken by:
  • Different subpixel layouts (increasingly likely on OLED displays and wide-gamut panels)
  • Image enhancement by "smart" displays
  • Displays running in YCbCr instead of RGB
  • Compressed streaming over something like DisplayLink or AirPlay
  • ...and I don't know but I bet you an Internet that it doesn't work with display stream compression on DisplayPort 1.4+
It's a liability that, even when it works, brings diminishing returns once you get to Retina-class displays. Apple can only guarantee that it will work on Apple displays - otherwise "do you feel lucky?" - and Apple no longer makes any non-retina displays.
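For anyone wondering what "knowing the physical layout of the subpixels" buys you, a toy illustration (nothing like a real rasteriser, just the idea):

```swift
// Toy illustration only: glyph coverage sampled at 3x horizontal resolution,
// with each sample driving one subpixel of a plain left-to-right RGB stripe.
// On a BGR panel, a rotated screen, a PenTile OLED, or after the display
// "enhances" the signal, these values hit the wrong subpixels -> colour fringes.
func subpixelRow(coverage: [Double]) -> [(r: Double, g: Double, b: Double)] {
    stride(from: 0, to: coverage.count - 2, by: 3).map { i in
        (r: coverage[i], g: coverage[i + 1], b: coverage[i + 2])
    }
}

// A glyph edge ending a third of the way into the second pixel:
print(subpixelRow(coverage: [1, 1, 1, 1, 0, 0]))
// [(1,1,1), (1,0,0)] -> fine on an RGB stripe, a visible fringe on anything else
```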

So, yes, it would be nice if Apple left it in as an option with an "if it breaks you get to keep both halves" disclaimer, but we all know that's not Apple's style.

The more general root issue behind this thread, though, is:

All display manufacturers test their own displays on Windows - testing them on Mac is left to Apple and the Mac community. Maybe a display has a "quirk" which Windows tolerates but MacOS doesn't. Maybe Apple have implemented the standard wrong. Maybe Windows has implemented the standard wrong and the work-around breaks the product on the Mac. The only solution is to check reviews (proper reviews, not gushy YouTube unboxings) and forums like this before buying.

This YouTube video here also talks about this issue, and how that dude went from a 4K display back to a 2560x1440 display:
Yes, they do need a 2560x1440 display - not "because scaling" but because their regular M1 is struggling to run Blender at 4K 1:1, let alone 5K (which is what is happening internally in scaled mode). If they'd gone 5K instead to avoid scaling, they'd probably still have problems (although Blender support for Apple Silicon may have improved since then).

It's particularly ironic that they chose Blender as an example, since Blender renders its own, fully scalable, resolution-independent UI, so you can just run it in 1:1 or 2:1 mode, avoiding all of the fractional scaling issues, and adjust the UI scale to taste. Unfortunately, they're citing that ruddy Bjango.com article, which presumes that you have to run a 4K monitor in fractional scaling mode. (It's not so much the article itself as the way it's prone to cherry-picking - the simulated "examples" illustrate and explain scaling artefacts quite well, but they greatly exaggerate what you'd actually notice in reality - and anybody using fine hatching or checkerboards on digital media is holding it wrong.)

Actually, that's what I've been trying to explain: it's not a limitation of pixel-doubled 4K @ 27" (163 ppi)--all you need to do is adjust your default font sizes (or app zoom settings) so the text is the same absolute size as on a pixel-doubled (i.e., default) 5K @ 27", and you can fit nearly the same content.

Hallelujah! Somebody gets it! Lots of people here are basing the "only 220ppi will do" dogma on a couple of articles that totally ignore the possibility of using 1:1 or 2:1 screen mode on a 4k screen, and that most applications allow you to freely scale the actual content.

Shock news: yes, 220ppi displays are optimum for MacOS (if your entry-level Mac can push that many pixels in your favourite app), and if all else were equal we'd all be rocking dual Pro Display XDRs, even if we had to prop them up on a pile of bricks rather than buy the stands. 4K is a compromise, but it's a compromise that gets you a dual-display setup for half the price of a single Studio Display with the decent stand.

...and if your bionic eyes can notice artefacts on a 27" display in "scaled 1440p" mode then they'll probably be able to read the system font on a 27" display in 3840x2160 mode. Let alone a 30" or larger 4k display.
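For anyone who wants the arithmetic behind "adjust your font sizes", my back-of-envelope numbers (assuming 27" 16:9 panels):

```swift
// Back-of-envelope numbers (27" 16:9 panels assumed, ~23.5" wide).
let ppi5K = 5120.0 / 23.5        // ~218 ppi
let ppi4K = 3840.0 / 23.5        // ~163 ppi
// At 2:1 ("pixel-doubled") a point is 2 hardware pixels, so the same point size
// looks physically bigger on the 4K. To match the 5K's text size:
let sizeOn5K = 13.0                                   // e.g. a 13 pt editor font
let equivalentOn4K = sizeOn5K * (ppi4K / ppi5K)       // ~9.75 pt, i.e. zoom down ~25%
print(ppi5K, ppi4K, equivalentOn4K)
// ...and since the 4K desktop is also 0.75x as many points wide (1920 vs 2560),
// the same amount of content ends up fitting on screen.
```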
 

eltoslightfoot

macrumors 68020
Feb 25, 2011
2,277
2,716
I have an LG 32UN650-W 4K 32" monitor which works fantastic at 100% display scaling. I also have a Dell 27" Gaming Monitor S2721DGF, 1440p 165Hz. This also works fantastic - it seems to lack the fuzziness issues 1440p had in the past. Is it as sharp as my 4K? Of course not--neither is it under Windows. But it works fine when I need it.

Anyway, I think the M1s seem to work far better with external monitors than the Intel Macs did, but YMMV of course based on what you have or buy in terms of monitors.
 

theluggage

macrumors 604
Jul 29, 2011
7,548
7,469
I mean, even if these hints are provided by the font author, it's still pretty much awful. So what if it's sharper, it's also uglier.
You don't have to argue too hard to convince me that MacOS ~2007 rendered fonts better than Windows ~2007. The point is that, in typography, different font sizes aren't always the result of linear scaling.

It's very hard to judge from an image without knowing how it was prepared, what software was used etc. ClearType/sub-pixel anti-aliasing is only supposed to be viewed on the device it was rendered for. If those are screenshots from WordPad vs TextEdit I'd be totally unsurprised - if they're exported bitmaps from the same Illustrator file on both platforms I'd be worried. There are clearly differences in kerning and leading - which you can adjust in a decent drawing/DTP package. In a DTP package, designing for print, the important thing would be which one got the character widths, line lengths and spacing "right".
 
  • Like
Reactions: eltoslightfoot

Feek

macrumors 65816
Nov 9, 2009
1,344
1,978
JO01
I've not read the entire thread but I'd like to question the title (and apologies if this has been picked up on). M1, and I assume M2 Macs work perfectly well with the Apple Studio Display. I've got a couple connected to my MSU and one connected to my M1 MBA and the display is fantastic. No issues with scaling, they just look superb.
 
  • Like
Reactions: eltoslightfoot

theorist9

macrumors 68040
May 28, 2015
3,710
2,812
I've not read the entire thread but I'd like to question the title (and apologies if this has been picked up on). M1, and I assume M2 Macs work perfectly well with the Apple Studio Display. I've got a couple connected to my MSU and one connected to my M1 MBA and the display is fantastic. No issues with scaling, they just look superb.
The thread is specifically about the performance of non-Apple monitors with the Mac. The title was sloppy, but I think you can tell what the thread is about from reading the first post.
 
  • Like
Reactions: Feek

theorist9

macrumors 68040
May 28, 2015
3,710
2,812
Hallelujah! Somebody gets it! Lots of people here are basing the "only 220ppi will do" dogma on a couple of articles that totally ignore the possibility of using 1:1 or 2:1 screen mode on a 4k screen, and that most applications allow you to freely scale the actual content.
I'd like to say Hallelujah as well! No one with whom I've argued this has been willing to concede the point, even after I've provided those Mathematica screenshots, which is very frustrating, so that acknowledgement is much appreciated!
 

theorist9

macrumors 68040
May 28, 2015
3,710
2,812
neither approach has any inherent “sharpness loss” to it. For vector graphics it doesn't matter whether you render at 2x2 and then downsample to 1.5x1.5 or render directly at 1.5x1.5, as long as you do your rendering correctly.
When it comes to MacOS, I don't believe that first sentence is correct. Fonts in MS Word, for instance, appear to be vectorized (you can increase the zoom as much as you'd like and you don't get pixelization), yet they are less sharp with non-integer scaling in MacOS.
So what if it's sharper, it's also uglier. Just look at the outlines of "S" and "p" on both screenshots, there is a notable difference — the glyphs on the right are elegant and smooth, with ink-like quality, while the left ones are a sad mess.

As I said, I don't know if ClearType is still this aggressive about manipulating glyphs. From what I understand, Apple ignores hinting altogether in order to give text a consistent, print-like appearance, even if it can introduce additional blurriness at lower DPI.
When you don't have a Retina display, this really is a case of pick your poison, and people are passionate on both sides.

Unless you're a graphic artist who needs accurate glyphs for their work, there are no absolutes — it's purely personal preference. A lot of that may come down to variations in vision, and how sensitive people are to blurred text. My own view is the opposite of yours: So what if it's prettier, it's also blurrier. I'd far prefer less aesthetically pleasing fonts, if it saves me from eye fatigue. And, on low-DPI displays, I don't find either MacOS or Windows fonts ink-like. The Windows fonts because they're jaggy, and the Mac fonts because they're blurry. Good-quality printed text is neither.
 
Last edited:

andreasfcb

macrumors newbie
Oct 27, 2012
27
12
From 2017 to 2022 I had been using a 27'' iMac 5K with two additional 27'' 4K displays. I did not have any issues with this setup, but only one scaling option really works well: You need to scale the 5K with the factor of 2x and the 4K with the factor of 1.5x, otherwise you will not find a good matching ratio. Fortunately, this scaling factor is just perfect for working.

It is true that 4K screens might look slightly less sharp than possible due to the bad MacOS scaling. But as a perfectionist, developer and photographer I did not have any complaints there. Even though I had the 5K iMac right next to my 4K screens, I did not notice the difference as much as some people claim in this thread. The difference compared to Full HD or 2K is far larger.

Now I am using a 14'' MBP and triple 27'' 4K displays. No complaints there either using the same scaling as mentioned above (2560*1440, 1.5x).
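Rough numbers for why that particular pairing lines up (my arithmetic, 27" panels assumed):

```swift
// Why 2x on the 5K and 1.5x on the 4K line up (27" panels assumed):
// both give a 2560x1440-point desktop at almost identical points per inch.
let desktop5K = (w: 5120.0 / 2.0, h: 2880.0 / 2.0)   // (2560, 1440) points
let desktop4K = (w: 3840.0 / 1.5, h: 2160.0 / 1.5)   // (2560, 1440) points
let pointsPerInch5K = 218.0 / 2.0                     // ~109   (218 ppi native)
let pointsPerInch4K = 163.0 / 1.5                     // ~108.7 (163 ppi native)
print(desktop5K, desktop4K, pointsPerInch5K, pointsPerInch4K)
```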
 

theorist9

macrumors 68040
May 28, 2015
3,710
2,812
From 2017 to 2022 I had been using a 27'' iMac 5K with two additional 27'' 4K displays. I did not have any issues with this setup, but only one scaling option really works well: You need to scale the 5K with the factor of 2x and the 4K with the factor of 1.5x, otherwise you will not find a good matching ratio. Fortunately, this scaling factor is just perfect for working.

It is true that 4K screens might look slightly less sharp than possible due to the bad MacOS scaling. But as a perfectionist, developer and photographer I did not have any complaints there. Even though I had the 5K iMac right next to my 4K screens, I did not notice the difference as much as some people claim in this thread. The difference compared to Full HD or 2K is far larger.

Now I am using a 14'' MBP and triple 27'' 4K displays. No complaints there either using the same scaling as mentioned above (2560*1440, 1.5x).
I have a 4k 27" next to my 5k 27" iMac. I use the 4k with 2:1 scaling, adjust magnification and/or font size as needed, and find that works quite well. But even with that, I do notice the difference in sharpness, so I think it varies with the person.
 
  • Like
Reactions: AAPLGeek

andreasfcb

macrumors newbie
Oct 27, 2012
27
12
I have a 4k 27" next to my 5k 27" iMac. I use the 4k with 2:1 scaling, adjust magnification and/or font size as needed, and find that works quite well. But even with that, I do notice the difference in sharpness, so I think it varies with the person.

I agree that you will notice it, but it depends on your distance and on what you do (obviously). I guess it will also depend on the scaling ratio. :)
 

giovincci7

macrumors newbie
Apr 24, 2024
1
0
I know this is an old post. I'm a first-time Mac user, and I am now facing external display issues. I have an LG 4K display that I have been using for more than 5 years with Windows. I used it with my ThinkPad P51, and am now using it with a desktop that I built myself. The ThinkPad is Intel-based, and the PC I built is AMD-based. I've never had any issues with my monitor.

I just bought an MB Pro M3 Pro 14 inch, and I have been having issues whenever I connect it to my external monitor. The monitor would show vertical and horizontal lines, although not at the same time. The vertical lines came first, before the horizontal lines. Prior to the lines showing, the monitor became unresponsive and it appeared as if there was burn-in. Weirdly enough, the fix for it is to connect the monitor back to my Windows machine. The unresponsiveness and the lines disappeared after it was connected to my Windows desktop for a few minutes. The first time I had the issue, I thought my monitor was ready to conk out on me.

Although I have somehow come across a fix for the external monitor issues, I am now wary each time I connect my MB to my external monitor. I have begun to shop for a replacement monitor and wouldn't mind suggestions. I am also looking at the rtings site before getting a new monitor. My question to those who have more experience using Macs with external monitors: is there really an external monitor that's not an Apple-branded monitor that works with a Mac with no issues?
 

pshufd

macrumors G3
Oct 24, 2013
9,967
14,446
New Hampshire
I know this is an old post. I'm a first-time Mac user, and I am now facing external display issues. I have an LG 4K display that I have been using for more than 5 years with Windows. I used it with my ThinkPad P51, and am now using it with a desktop that I built myself. The ThinkPad is Intel-based, and the PC I built is AMD-based. I've never had any issues with my monitor.

I just bought an MB Pro M3 Pro 14 inch, and I have been having issues whenever I connect it to my external monitor. The monitor would show vertical and horizontal lines, although not at the same time. The vertical lines came first, before the horizontal lines. Prior to the lines showing, the monitor became unresponsive and it appeared as if there was burn-in. Weirdly enough, the fix for it is to connect the monitor back to my Windows machine. The unresponsiveness and the lines disappeared after it was connected to my Windows desktop for a few minutes. The first time I had the issue, I thought my monitor was ready to conk out on me.

Although I have somehow come across a fix for the external monitor issues, I am now wary each time I connect my MB to my external monitor. I have begun to shop for a replacement monitor and wouldn't mind suggestions. I am also looking at the rtings site before getting a new monitor. My question to those who have more experience using Macs with external monitors: is there really an external monitor that's not an Apple-branded monitor that works with a Mac with no issues?

I use Dell UltraSharp U2718Q and U2720Q monitors on my Studio (formerly a mini and an M1 Pro MacBook Pro) and haven't had problems since early 2022. My monitor issues were solved with macOS updates and better cables.
 