...but you wouldn't do that on Windows. You'd set the resolution to 3840x2160 and the scaling to 120, 150 or 175% to adjust the UI size to taste. MacOS doesn't support the variable scaling option: you've got 100% (1:1) and 200% (2:1) baked in, which (for a 4k display) give you "looks like 3840x2160" and "looks like 1920x1080" respectively, and anything you want in between is provided by rendering internally at some other resolution and downsampling the result to 4k.
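To make the arithmetic concrete, here's a minimal Python sketch of that "render at 2x, then downsample" scheme (my own illustration of the numbers, not Apple's actual API; the "looks like" sizes are two of the scaled modes macOS commonly offers):

```python
# Sketch of macOS HiDPI scaling arithmetic on a 4k panel: the UI is
# rendered at 2x the chosen "looks like" size, then the whole frame is
# scaled to the panel's native resolution.

PANEL_W, PANEL_H = 3840, 2160               # native 4k panel

def backing_store(looks_like_w, looks_like_h):
    """Internal render size, and the factor applied to fit the panel."""
    render_w, render_h = 2 * looks_like_w, 2 * looks_like_h
    return render_w, render_h, PANEL_W / render_w

for w, h in [(1920, 1080), (2560, 1440)]:
    rw, rh, factor = backing_store(w, h)
    print(f'"looks like {w}x{h}" -> renders {rw}x{rh}, scaled by {factor:.2f}')

# "looks like 1920x1080" -> renders 3840x2160, scaled by 1.00 (pixel-perfect 2:1)
# "looks like 2560x1440" -> renders 5120x2880, scaled by 0.75 (fractional)
```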

Windows has worked that way since the early days (you used to choose PPI rather than % but it works the same way) whereas Macs - possibly thanks to their DTP heritage - have always worked on a fixed notional PPI (1 pixel = 1 point in the early days).

I think the Windows method is better in theory, but it is very much dependent on applications implementing it properly, and you might still get fractional scaling at the application level: for starters, developers can't include bitmap assets for every possible scale - and for 'vector' rendering they might choose to render into a fixed-resolution buffer and then scale that to the screen. Mac applications just have to cater for the 1:1 and 2:1 modes with two sizes of bitmap/icon, and will "fail safe" if they only support 1:1.
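To illustrate the bitmap-asset trade-off in the paragraph above, here's a toy sketch (the asset scales and selection policy are my own invented example, not any real framework's logic):

```python
# An app ships bitmaps at a few fixed scales and must resample the
# nearest one for whatever scale the OS requests.

MAC_ASSETS = [1.0, 2.0]                     # the @1x / @2x bitmaps a Mac app ships

def pick_asset(requested, shipped=MAC_ASSETS):
    """Prefer the smallest shipped asset at or above the requested scale;
    downscaling a large bitmap looks better than upscaling a small one."""
    bigger = [s for s in shipped if s >= requested]
    chosen = min(bigger) if bigger else max(shipped)
    return chosen, requested / chosen       # resample factor actually applied

for requested in (1.0, 1.25, 1.5, 2.0):     # Windows-style arbitrary scales
    chosen, resample = pick_asset(requested)
    note = "exact" if resample == 1.0 else "resampled"
    print(f"{requested:g}x requested -> @{chosen:g}x asset, x{resample:.2f} ({note})")

# Only 1.0x and 2.0x map exactly; every fractional scale forces a resample.
```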

It's worth noting that MacOS fractional scaling is very similar to the "full screen anti-aliasing" used to make games look *better*.
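The analogy holds because both render at a higher resolution than the output and filter down. A toy box-filter version of that downsample (deliberately simplified; real compositors use better filters than a plain 2x2 average):

```python
def downsample_2x(image):
    """Average each 2x2 block of a grayscale image rendered at 2x
    down to one output pixel (a box filter)."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)           # averaging softens hard edges
        out.append(row)
    return out

# A hard black/white edge at 2x becomes a smoothed ramp at 1x:
hi_res = [[0, 0, 1, 1]] * 2 + [[0, 1, 1, 1]] * 2
print(downsample_2x(hi_res))                # [[0.0, 1.0], [0.5, 1.0]]
```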

There are some problems here:
1) Problems with multi-display setups
2) Problems with legacy apps or other apps - for example on Intel HD integrated graphics, or some web apps
3) Problems with the taskbar or UI
4) Problems with GPUs that don't support 4k at 60 fps, only 4k at 30 fps

In conclusion: macOS offers both types of scaling, while Windows offers only one type of scaling with the zoom option, which creates other issues.

Returning to the initial discussion: 4K on macOS looks great and there are no issues with apps or multiple displays. 5K is better, but 4K offers a better viewing experience than 1080p and 1440p at a much lower price compared to 5K.
 
Obviously 4K is still by far the standard HiDPI monitor resolution, and yet there is so much discussion about 4K vs 5K.

I’ve got a 27” 4K and I think it… looks fine? It’s running at “1440p” or whatever you’d call it.

As I understand it, the Mac likes an exact 2:1 halving of the resolution, and 4K isn't 2x 1440p, so it actually mimics 5K internally and then scales that down to the 4K panel

And back to my original point… everything seems to work/look fine? So why is everyone acting like 4K is such a problem?

I’m potentially considering a 32” monitor, so this is mainly why it’s also popping up in my mind. Add in that I might consider a different screen ratio… 16:9, 16:10, or ultra wide…
It isn't. You can still have a HiDPI resolution with a 4k monitor, and I have found it is the sweet spot between price, functionality, and still having sharp, perfect text.

I mean, if I could get a 5k monitor for like $400, then maybe, but they are all hideously expensive.
 
...but you wouldn't do that on Windows. You'd set the resolution to 3840x2160 and the scaling to 120, 150 or 175% to adjust the UI size to taste. MacOS doesn't support the variable scaling option: you've got 100% (1:1) and 200% (2:1) baked in, which (for a 4k display) give you "looks like 3840x2160" and "looks like 1920x1080" respectively, and anything you want in between is provided by rendering internally at some other resolution and downsampling the result to 4k.

Windows has worked that way since the early days (you used to choose PPI rather than % but it works the same way) whereas Macs - possibly thanks to their DTP heritage - have always worked on a fixed notional PPI (1 pixel = 1 point in the early days).

I think the Windows method is better in theory, but it is very much dependent on applications implementing it properly, and you might still get fractional scaling at the application level: for starters, developers can't include bitmap assets for every possible scale - and for 'vector' rendering they might choose to render into a fixed-resolution buffer and then scale that to the screen. Mac applications just have to cater for the 1:1 and 2:1 modes with two sizes of bitmap/icon, and will "fail safe" if they only support 1:1.

It's worth noting that MacOS fractional scaling is very similar to the "full screen anti-aliasing" used to make games look *better*.
Yeah I mean I use my 27" 4k gaming monitor with both Mac and Windows. To me, it looks better in Mac, but Windows has been catching up--until (exactly as you say) you hit some app that hasn't implemented the zoom properly and everything is VERY TINY.

Both approaches have their strengths and weaknesses.
 
[Attached photos: DSCF1244.JPG, DSCF1245.JPG]


Here's an extremely unscientific test. I used my Fuji X-T5 with a 30mm macro lens to take two photos of this Macrumors thread.

The photo above is from my Dell U4025QW, and the bottom is from my M4 MBP.

There's more moiré in the Dell photo, for whatever reason.

Obviously the MBP looks "better", but the Dell still looks pretty damn good, and this is magnified.

Back when HiDPI displays came out, Apple was one of the first to include them. macOS was the best at handling the scaling.

Windows has since surpassed macOS in its handling of HiDPI displays (it was horrible at first); I think it's the best of any OS.

Linux is atrocious. Particularly in X11, just freaking horrific. Wayland is an improvement, but I can't even use it because of other incompatibility problems (work machine, can't make the changes I need to, blah blah blah).
 
Uh, using it now actually. MSI Afterburner is something I use daily that does not scale, as do some legacy applications and open source software. It's horribly difficult to use these applications since everything is so small.
Ok, you found one of the 5%. Dumb programs with their own UI toolkit are among the worst offenders. Can't say the same about open source software: plenty of big-name open source programs like GIMP, LibreOffice, Krita, Calibre, etc. are properly scaled.

FYI, Afterburner stupidly can scale its UI, but you have to configure it in its own settings.
 
Yeah I mean I use my 27" 4k gaming monitor with both Mac and Windows.

Both approaches have their strengths and weaknesses.
I use the same 27" 4k with Windows laptop and Mac. I like the simplicity of the Mac approach. If a friend or relative were getting a new Mac mini and asked for a third party monitor recommendation (because ASD is not in their budget), it would be very easy these days. Their setup would be plug and play, and if they looked in Settings > Displays, the default is very easy to understand within the depiction of Larger Text ... More Space.
 
I use the same 27" 4k with Windows laptop and Mac. I like the simplicity of the Mac approach. If a friend or relative were getting a new Mac mini and asked for a third party monitor recommendation (because ASD is not in their budget), it would be very easy these days. Their setup would be plug and play, and if they looked in Settings > Displays, the default is very easy to understand within the depiction of Larger Text ... More Space.
Totally agree, and then for the "real ones" like us, you can switch it to actual resolutions. :D
 
Ok, you found one of the 5%. Dumb programs with their own UI toolkit are among the worst offenders. Can't say the same about open source software: plenty of big-name open source programs like GIMP, LibreOffice, Krita, Calibre, etc. are properly scaled.

FYI, Afterburner stupidly can scale its UI, but you have to configure it in its own settings.
I have about a dozen apps in Windows that suffer from this. Some also scale very poorly, making the buttons small but the button text extremely large. The fact is, this is where macOS shines: Windows gives you such insane flexibility that you can essentially kill any form of UI consistency. This is why I find programs on macOS visually much better than on Windows. FileZilla vs Transmit, for example.
 
That's very interesting. Other 5K 27" displays I've looked into (e.g. the Asus ProArt) were listed as matte. Do you know of any non-Apple 27" 5K glossy displays?

Richard.
The only non-Apple glossy 5k 27" I know of is the LG UltraFine:

I've not used it myself, so I don't know if its AR coating is as good as Apple's. It is a much older design, and users have reported QC issues. But perhaps they've finally fixed those. Prior to the ASD, Apple carried the UltraFine on their own website.

If I needed a 5k display and my budget were <$1k, I'd get a used ASD rather than a new LG. I recall seeing one on Craigslist last year for $800 with AppleCare+.

I'm disappointed all the newer Retina-class (220 ppi) displays from Samsung, Dell, and Asus are matte. A lot of Mac users are used to and enjoy 5k 27" glossy screens (because of the iMac's long history), so I'd think if they offered a glossy version, that would expand their market. Then again, they'd need to make the AR coating as good as that on Apple's glossy displays.
 
Posts about this topic often describe it as an objective science. A monitor with this many pixels is objectively worse than one with that many. This is all true, but actually, the most useful thing is for an individual to do some A/B testing, because an individual may not be able to discern a difference, or even a preference, between displays.
I run an LG 4K 27” 60 Hz and I cannot see any pixels at my viewing distance. I don't know what scaling I use; I just choose the one that I prefer and suits me best from System Settings.
I have played with the 5K ASD in-store and it looks better, but not enough for me to care. My eyes aren’t good enough to care about that difference.
But put me in front of a Mac with a 2K monitor and it doesn’t matter what I do with scaling. I can’t stand using it.
These things are all very personal.
 
Posts about this topic often describe it as an objective science. This many pixels is objectively worse than that many. This is all true, but actually, the most useful thing is for an individual to do some A/B testing, because an individual may not be able to discern a difference, or even a preference, between displays.
I run an LG 4K 27” 60 Hz and I cannot see any pixels at my viewing distance. I don't know what scaling I use; I just choose the one that I prefer and suits me best from System Settings.
I have played with the 5K ASD in-store and it looks better, but not enough for me to care. My eyes aren’t good enough to care about that difference.
But put me in front of a Mac with a 2K monitor and it doesn’t matter what I do with scaling. I can’t stand using it.
These things are all very personal.
Exactly right. When I posted my findings here ( https://forums.macrumors.com/thread...rs.2445661/page-2?post=33635386#post-33635386 ), I was explicit in saying these are for me.

Those who say everyone needs a 5k monitor to have a pleasing viewing experience are flat-out wrong.

And those who say no one needs a 5k monitor to have a pleasing viewing experience likewise are flat-out wrong.

It depends on how good your eyesight is—some may not even be capable of detecting the difference—and, for those who can see the difference, it comes down to how much tolerance you have for blur.

It also depends on what kind of work you're doing. It's harder to see the difference if you're working with, say, video rather than text.
 
Exactly right. When I posted my findings here ( https://forums.macrumors.com/thread...rs.2445661/page-2?post=33635386#post-33635386 ), I was explicit in saying these are for me.

Those who say everyone needs a 5k monitor to have a pleasing viewing experience are flat-out wrong.

And those who say no one needs a 5k monitor to have a pleasing viewing experience likewise are flat-out wrong.

It depends on how good your eyesight is—some may not even be capable of detecting the difference—and, for those who can see the difference, it comes down to how much tolerance you have for blur.

It also depends on what kind of work you're doing. It's harder to see the difference if you're working with, say, video rather than text.
I agree with all that. There are other considerations, too. I have a friend who probably has worse eyesight than mine, but values build quality and the premium appearance that the monitor on his desk demonstrates to customers. He has two ASDs.

The other thing I'll add is that it wouldn't matter how crisp the text or how cheap the price: I wouldn't get a Samsung S9 Viewfinity with all of its software bloat, reliance on an IR remote, etc.
 
4k is terrible on 27+ inch screens, and that is after using 5K displays since 2022. 4k is only meant for monitors under 24 inches. In fact, in my main setup my M1 Pro MacBook Pro (now M4 Max) hooks up through a Thunderbolt dock to 2 Studio Displays (VESA version) on two standalone monitor mounts. Yes, it makes them look like generic computer monitors, but it does the job, as I need pixel-perfect output for developing apps. Also, 4K at 1440p scaling does not map properly, as 4K does not have enough pixels, and 1080p scaling is way too big.

I bought a 4K monitor for my father, who has an M1 Mac mini, since he probably doesn't care and only uses it for web browsing. I tried 1440p scaling and it doesn't look good. Also, 1440p is not acceptable anymore with recent macOS versions as Apple got rid of text antialiasing. While the text does not look blurry, it looks pretty bad without the anti-aliasing compared to a 5K. While a matte 5K monitor like the Samsung and Asus ones won't be nearly as sharp as a Studio Display or LG UltraFine since it's not glossy, it does look better than 1440p and 4K at 1440p scaling. I have a Viewfinity S9 I got for cheap as a secondary display to replace the 1440p, and it's better, but not as good as the Studio Display.

If you can afford it, the Studio Display is a good long-term monitor, as they last a long time. I don't do competitive gaming, so I don't need high refresh rates. Even a matte 5K will be better than 4K on macOS any day. 4K, depending on the user, is not good for 1440p scaling and never will be, since it does not have the DPI. If you can live with 1080p scaling and everything looking too big, that is fine.

At least the macOS scaling approach is better than Windows'. Windows can be hit or miss, as apps may or may not support scaling and can look too small or have blurry text. What infuriates me is apps having their own scaling setting, meaning on initial launch everything looks really tiny until you manage to change the scaling to the appropriate setting. At least on macOS it scales perfectly, as long as you have a monitor at around 218 DPI or above. It makes me wonder how macOS will handle 8K, since it's above 218 DPI.
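For what it's worth, the PPI arithmetic behind that 218 figure is easy to check; a quick sketch (the 32-inch 8K size is just a hypothetical example):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [("27in 4K", 3840, 2160, 27),
                      ("27in 5K", 5120, 2880, 27),
                      ("32in 8K", 7680, 4320, 32)]:
    print(f"{name}: {ppi(w, h, d):.0f} ppi")

# 27in 4K: 163 ppi  (below the ~218 ppi Retina target)
# 27in 5K: 218 ppi  (right on it)
# 32in 8K: 275 ppi  (well above it)
```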
 
4k is terrible on 27+ inch screens, and that is after using 5K displays since 2022. 4k is only meant for monitors under 24 inches. In fact, in my main setup my M1 Pro MacBook Pro (now M4 Max) hooks up through a Thunderbolt dock to 2 Studio Displays (VESA version) on two standalone monitor mounts. Yes, it makes them look like generic computer monitors, but it does the job, as I need pixel-perfect output for developing apps. Also, 4K at 1440p scaling does not map properly, as 4K does not have enough pixels, and 1080p scaling is way too big.

I bought a 4K monitor for my father, who has an M1 Mac mini, since he probably doesn't care and only uses it for web browsing. I tried 1440p scaling and it doesn't look good. Also, 1440p is not acceptable anymore with recent macOS versions as Apple got rid of text antialiasing. While the text does not look blurry, it looks pretty bad without the anti-aliasing compared to a 5K. While a matte 5K monitor like the Samsung and Asus ones won't be nearly as sharp as a Studio Display or LG UltraFine since it's not glossy, it does look better than 1440p and 4K at 1440p scaling. I have a Viewfinity S9 I got for cheap as a secondary display to replace the 1440p, and it's better, but not as good as the Studio Display.

If you can afford it, the Studio Display is a good long-term monitor, as they last a long time. I don't do competitive gaming, so I don't need high refresh rates. Even a matte 5K will be better than 4K on macOS any day. 4K, depending on the user, is not good for 1440p scaling and never will be, since it does not have the DPI. If you can live with 1080p scaling and everything looking too big, that is fine.

At least the macOS scaling approach is better than Windows'. Windows can be hit or miss, as apps may or may not support scaling and can look too small or have blurry text. What infuriates me is apps having their own scaling setting, meaning on initial launch everything looks really tiny until you manage to change the scaling to the appropriate setting. At least on macOS it scales perfectly, as long as you have a monitor at around 218 DPI or above. It makes me wonder how macOS will handle 8K, since it's above 218 DPI.
This is so wrong it is comical. 4k 27 inch screens are fantastic. I am using one right now (an Acer). It goes to 160 Hz and works great with both Windows and Mac. It's literally on sale for $300 on Amazon right now.

1440p scaling looks absolutely fantastic on the Mac. So there are other sides to the argument than just yours.

Also, Windows 11 has improved the scaling situation massively, as have apps updating over the years. For instance, Scrivener used to suffer from the exact situation you describe, but it was fixed years ago. Most apps are fine now.
 
This is so wrong it is comical. 4k 27 inch screens are fantastic. I am using one right now (an Acer). It goes to 160 Hz and works great with both Windows and Mac. It's literally on sale for $300 on Amazon right now.

1440p scaling looks absolutely fantastic on the Mac. So there are other sides to the argument than just yours.

Also, Windows 11 has improved the scaling situation massively, as have apps updating over the years. For instance, Scrivener used to suffer from the exact situation you describe, but it was fixed years ago. Most apps are fine now.
I'd have to disagree with the "absolutely fantastic" part, but otherwise yeah, 27" 4K displays using macOS's "like 1440p" scaling look fine. As soon as you look close, you can see a weird sheen around text, because the display can't show a perfect pixel-to-pixel rendering at this scaling - hence the need for a 27" 5K, which renders "like 1440p" perfectly.
 
I'd have to disagree with the "absolutely fantastic" part, but otherwise yeah, 27" 4K displays using macOS's "like 1440p" scaling look fine. As soon as you look close, you can see a weird sheen around text, because the display can't show a perfect pixel-to-pixel rendering at this scaling - hence the need for a 27" 5K, which renders "like 1440p" perfectly.

If you have a bionic eye and are able to see this difference, then you should also be able to see the dot pitch on the 27" 5K monitor, and it should bother you just as much - yet you say the 5K looks great.
 
There are some problems here:

As I said - the Windows approach is far better on paper; the Mac method is more robust in practice.

In conclusion: macOS offers both types of scaling, while Windows offers only one type of scaling with the zoom option, which creates other issues.
MacOS doesn't offer an equivalent to Windows' freely-adjustable zoom/PPI. You get 100% and 200% (I think 300% is supported by the OS - wait for 8k displays!); on Windows you can choose anything in between.

Posts about this topic often describe it as an objective science. A monitor with this many pixels is objectively worse than one with that many. This is all true, but actually, the most useful thing is for an individual to do some A/B testing.
The "objective science" is that 5 is greater than 4 and 5k@27" works better with MacOS than 4k@27". At typical viewing distance, 4k@27" pixels are just within the retina/20:20 vision limit for typical (not average or best) eyesight. 5k adds a comfortable margin - and even if you want to use fractional scaling to get a different UI size.

So A/B comparisons - or first impressions after "downgrading" from a 5k iMac - are just going to confirm that 5 is more than 4.
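To put rough numbers on that retina/20:20 limit: 20/20 vision resolves about 1 arcminute, so a display stops showing visible pixels once one pixel subtends less than that at your viewing distance. A back-of-the-envelope sketch (the viewing distances are my own assumptions):

```python
import math

ARCMIN = math.radians(1 / 60)               # 1 arcminute, in radians

def ppi_needed(distance_in):
    """PPI at which one pixel spans exactly 1 arcminute."""
    return 1 / (distance_in * math.tan(ARCMIN))

for d in (20, 24, 28):                      # plausible desk distances, inches
    print(f"at {d} in you need ~{ppi_needed(d):.0f} ppi")

# at 20 in you need ~172 ppi  -> 27" 4k (163 ppi) falls just short
# at 24 in you need ~143 ppi  -> 4k clears it; 5k (218 ppi) adds margin
# at 28 in you need ~123 ppi
```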

The test is - if possible - use 4k to actually do some work for half an hour and see if you notice any problem. Only if you're a YouTube influencer or blogger does your "work" involve climbing up on the desk and doing A/B comparisons with a jeweller's loupe!

Subjectively, you have to take into account that a 5k display costs 2-3 times as much as a 4k display and leaves you with a very limited choice of models.

I don't think anybody has anything negative to say about the picture quality on the Studio Display (apart from things like OLED/true HDR and HFR which aren't happening at 5k just yet) - just secondary things like the extra cost of a proper adjustable stand, no secondary video inputs, captive mains cable etc.

4k is terrible on 27+ inch screens, and that is after using 5K displays since 2022.
No, it's not "terrible". It's slightly less "crisp" than 5k and occasionally shows some minor artefacts due to fractional scaling (which you can avoid by switching modes and putting up with a slightly chunky UI). YMMV on whether that outweighs being able to buy a double or triple display setup for the price of one 5k.

Calling it "terrible" is complete hyperbole.

Plus, it depends which 4k you're comparing with the 5k iMac, which also had a decent colour gamut, great antiglare coating etc. that will make the cheapest 4k panels look rubbish, regardless of resolution.

Also, 1440p is not acceptable anymore with recent macOS versions as Apple got rid of text antialiasing.
Two problems there: (a) no, Apple really haven't got rid of text antialiasing, and (b) what the Mac calls "2560x1440" has nothing to do with a 1440p display beyond the apparent size of system fonts.

The latter is partly the fault of the silly way that Apple describes HiDPI screen modes, but "2560x1440" on a 4k display is actually 5k downsampled to 4k and has nothing to do with an actual 1440p display.

What Apple have done is remove subpixel antialiasing from text. That took advantage of the fact that each "pixel" on the physical display is actually 3 sub-pixels for R, G and B, so by tweaking the individual R, G and B values of pixels around the edge of shapes you could "smooth" the edges even more effectively than regular antialiasing. Very effective when it works, but not practical when you've got things like OLED displays with different R/G/B subpixel layouts, display stream compression, translucent display elements and, indeed, scaled display modes coming down the pike - all of which can potentially break subpixel antialiasing and give your text horrible rainbow fringes. (I'm pretty sure that's why MacOS mis-identifying screens as YPbPr instead of RGB used to be such a problem.) Subpixel antialiasing would be pretty pointless on "retina" displays anyhow, where the subpixels are at least 1/3 of the supposed "too small to see" limit.
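A toy model of what subpixel antialiasing buys (and costs), assuming a plain RGB-striped panel - my simplification; real implementations such as ClearType also filter heavily to tame the colour fringes:

```python
def coverage(left, right, edge):
    """Fraction of the span [left, right) lying under ink (ink = x < edge)."""
    return max(0.0, min(1.0, (edge - left) / (right - left)))

def render_edge(edge, pixels=4):
    """Black ink covering x < edge on a white row. Each pixel is three
    horizontal R/G/B subpixels, so the edge lands with 3x precision."""
    out = []
    for i in range(pixels):
        # grayscale AA would compute ONE coverage value per pixel;
        # subpixel AA computes one per colour stripe instead.
        rgb = tuple(round(1 - coverage(i + j / 3, i + (j + 1) / 3, edge), 2)
                    for j in range(3))
        out.append(rgb)
    return out

print(render_edge(1.5))
# [(0.0, 0.0, 0.0), (0.0, 0.5, 1.0), (1.0, 1.0, 1.0), (1.0, 1.0, 1.0)]
# Pixel 1 is deliberately tinted (dark red stripe, bright blue stripe):
# a sharper edge, but a colour fringe if the panel isn't the assumed
# RGB stripe - exactly the failure mode described above.
```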

That has possibly reduced text quality on true, standard def, 1440p screens but is irrelevant to "looks like 2560x1440" mode on a HiDPI screen.
 