I use a 27" monitor set to 2560x1440.

When I used a 5k monitor, like my now-retired 27" iMac, scaled to 2560x1440, it looked perfect.
However, a new 5k monitor for my Mac Mini is just a bit too pricey for me.
I tried a 4k, 3840x2160 monitor, and set it to 2560x1440. Because that isn't an integer scaling factor, everything looked a bit fuzzy. This has also been mentioned in various places on the Internet.
So I saved some money and bought a QHD, 2560x1440 monitor. It looks better than the more expensive monitor scaled down to 2560x1440.

As said above, it's what suits you.

If you want to use a 3840x2160 monitor, set to 3840x2160, then you will get the best setting.
If you set it to 2560x1440, it will look fuzzy. Get a native 2560x1440 monitor, you will save money and it will look better.
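If it helps to see why, here's a rough back-of-envelope sketch (Python, purely illustrative - the panel list is just examples) of which panels reach a 2560x1440 desktop with a whole-number scale factor:

Code:
# Rough check: does a 2560x1440 desktop map onto a given panel by a whole-number
# factor? macOS actually renders HiDPI modes at 2x and downsamples, but the
# integer/fractional distinction is the same either way.

panels = {
    "QHD 2560x1440":    (2560, 1440),
    "4K UHD 3840x2160": (3840, 2160),
    "5K 5120x2880":     (5120, 2880),
}

looks_like = (2560, 1440)

for name, (w, h) in panels.items():
    factor = w / looks_like[0]
    is_integer = factor.is_integer() and (h / looks_like[1]) == factor
    verdict = "integer - stays crisp" if is_integer else "fractional - goes a bit soft"
    print(f"{name}: scale factor {factor:.2f} ({verdict})")

# QHD: 1.00 (integer), 4K UHD: 1.50 (fractional), 5K: 2.00 (integer)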

Anyway, some of us remember when all you could get was 640x480, and we were all happy with that.

[Attachment: a 640x480 screen, for comparison]

BTW, the reason I am quoting resolutions explicitly is that around here, the advertising is a bit fuzzy about what UHD, 5k, QHD, 4K, HD, 2k, etc, all mean.
This is the exact same conclusion I came to - 27" QHD at 2560x1440 is the sweet spot for price/function/performance. Excellent for long sessions.

I had an iMac 27" and 2x 28" 4K monitors. The iMac screen was fantastic - the Samsung 4Ks not so much.

So when I upgraded to the Mac Studio M1 Max I went with 3x 27" HP QHD monitors at 2560x1440. Very pleased. May buy one more. I feel 32" is just too big for the same real estate - would need to go to much larger 49+ screens. Multiple 27s is more flexible than the monster screens.
 
As said above, it's what suits you.
...with the added consideration that 5k costs 2-3 times as much, and limits your choice to about 3 available models.

Newsflash: if you put a 4k display next to a 5k display, the 5k display is going to look sharper. The question is whether anybody is going to actually notice that in daily use - especially if they haven't been "spoiled" by a 5k display in the past.

There's little doubt that 5k is optimal for MacOS and apart from MacOS's DPI considerations, well, 5k > 4k. Or, you could buy 2 4k displays and have massive screen "real estate" plus change left over from the price of a single 5k.

It comes down to whether you're prepared to pay the hefty price difference for the small increase in quality.

If you want to use a 3840x2160 monitor, set to 3840x2160, then you will get the best setting.
Or set it to "looks like 1920x1080" and get exactly the same 3840x2160 resolution, and no fractional scaling, but with usable-sized UI elements and readable system fonts. All of those online articles about how 4k is rubbish tend to conveniently forget that option.

The system UI is a bit "big" for some people but works perfectly well (it's still a lot better than that 640x480 screen you posted!) Unless you're running ancient pre-retina software (which probably won't run on M-series anyhow) the only practical difference between "looks like 3840x2160" and "looks like 1920x1080" is the system UI size and everything else is being rendered in 1:1 4k.

Get a native 2560x1440 monitor, you will save money and it will look better.
If you're happy with your display, that's fine - after all, 2560x1440 was the bees knees not many years ago - but it isn't "better" by any objective standard. "Looks like 2560x1440" on a 4k display is actually 5k downsampled to 4k and renders a lot more detail than can be displayed on an actual 1440p display.
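For anyone who wants the arithmetic behind that, here's a rough sketch (Python, illustrative only - it assumes the standard 2x HiDPI backing scale, the rest is just division):

Code:
# Sketch of the "looks like" arithmetic on a 4K UHD panel. Assumes the standard
# 2x HiDPI backing scale; native 3840x2160 is simply 1 point = 1 pixel.

panel_width = 3840
hidpi_modes = [(1920, 1080), (2560, 1440)]   # "looks like" sizes

for w, h in hidpi_modes:
    backing = (2 * w, 2 * h)                 # rendered size before hitting the panel
    scale = panel_width / backing[0]
    if scale == 1.0:
        note = "maps 1:1 onto the panel, no resampling"
    else:
        note = f"downsampled to the panel by a factor of {scale:.2f}"
    print(f'"looks like" {w}x{h}: rendered at {backing[0]}x{backing[1]}, {note}')

# "looks like" 1920x1080: rendered at 3840x2160, maps 1:1 onto the panel
# "looks like" 2560x1440: rendered at 5120x2880, downsampled by a factor of 0.75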

The way Apple describes 4k modes as "2560x1440" when they're really nothing of the kind (they've even dropped the "looks like" now) is, I think, a big part of the problem.

Plus, a 4k display can actually display 4k video and images at 1:1 4k resolution (even if you have to take a few seconds to change to 1:1 or 2:1 mode) - a 1440p display simply can't.

Also, Apple aren't really considering standard-def mode as a priority, so a lot of people are complaining about changes to font rendering etc. (such as removing sub-pixel anti-aliasing) that make sense on modern displays but make standard def look worse.

This blog goes a bit more in depth into why some people don't like the non-integer scale:
That blog (TLDNR - $500 4k displays have drawbacks so buy a $5000 Pro XDR display like I did) - which always gets quoted when this subject comes up - is also part of the problem.

It's not exactly false & does clearly describe the source of scaling artefacts - but it hugely exaggerates them by showing enlarged "simulations" of worst-case scenarios (like 1-pixel checkerboards - using those anywhere would be insanely bad design) that you'll hardly notice in normal use. It then presents a load of prescriptive good/bad tables which act as if "looks like 1440p" is the only possible option, even on 24" and 32" displays where "looks like 1920x1080" (integer-scaled) and "looks like 3840x2160" (native) - respectively - would be the obvious modes to use.
 
I wish Apple would fix/update macOS to support popular 3rd party display resolution and ppi.
It does, and on a 4k UHD it offers you a choice between 2:1, 1:1 and various fractionally scaled modes.

If you mean switching to the Windows method of adjustable PPI - that can't be done without having developers rewriting their applications to call the appropriate OS functions to get the correct scaling when rendering, creating bitmaps etc. - something that has been established as "best practice" in Windows since at least Windows 3.1 while MacOS started out with 1 point = 1 pixel. My impression is that the Windows way is technically better but MacOS is more robust (particularly if you have a mixture of standard and high def displays as you often get with laptops plugged into externals). Last time I used Windows like that, some apps occasionally decided to mess up the scaling and present unusably small/large UIs or messed-up dialog box layouts.
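For anyone who hasn't dealt with the Windows side of this, a very rough sketch of the difference between the two models - no real OS APIs here, just the arithmetic a DPI-aware Windows app has to do versus the fixed point scale a Mac app sees:

Code:
# Sketch only - no real OS calls, just the arithmetic each model implies.

# Windows model: a DPI-aware app asks the OS for the monitor's DPI and scales its
# own layout by dpi / 96, so any factor (125%, 150%, 175%...) is possible - but
# every app has to do this correctly.
def windows_layout_px(logical_px, monitor_dpi):
    return round(logical_px * monitor_dpi / 96)

# macOS model: an app draws in points at a fixed 1x or 2x backing scale; anything
# in between is handled by the OS rendering at 2x and downsampling the whole
# framebuffer, so apps never see fractional scale factors.
def mac_backing_px(points, backing_scale):   # backing_scale is 1 or 2
    return points * backing_scale

print(windows_layout_px(100, 144))  # 150% scaling -> 150 px
print(mac_backing_px(100, 2))       # Retina -> 200 px; panel fit is the OS's problem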

They aren't offering any external displays directly.
Yes they do - Studio Display and Pro XDR. Before that the LG Ultrafine 5k was very much an Apple/LG collaboration.

If they have difficulty providing a 32" 6K iMac, why not go back to offering a 27" 5K one that starts at $1799 with an M4, 16GB and 256GB?
1. Because they can make more money selling you a $1600 Studio Display and $800 Mac Mini.
2. Because the market for the 5k iMac was almost certainly shrinking and reducing the economy of scale that allowed them to offer an iMac with a 5k display for what was always (for Apple) a knock-down price.

You have
- a general shift to laptops anyway
- Apple Silicon pretty much removing the performance advantage that Mac desktops previously had over laptops
- The 24" iMac with a larger, higher-res screen and faster processor than previous "small" iMacs taking a bite out of low-end 5k iMac sales
- Mac Studio taking a bite out of high-end iMac sales - some people liked the iMac, some people only bought an iMac because, at the time, it was the only viable desktop Mac on offer. The Mac Studio was a huge improvement for many.
- M-series Mac Mini taking a bite out of all iMac sales.
- Almost zero uptake of 5k in the PC world - which would have brought down 5k panel prices, something that Apple may have counted on when initially setting iMac prices.

Apple can sell Minis and Studios to people who want to use 3rd party displays, and they can sell Studio Displays (and maybe the occasional Pro XDR) to MacBook users who want the ultimate docking station. Makes for a much more flexible desktop range.
 
That blog (TLDNR - $500 4k displays have drawbacks so buy a $5000 Pro XDR display like I did) - which always gets quoted when this subject comes up - is also part of the problem.

It's not exactly false & does clearly describe the source of scaling artefacts - but it hugely exaggerates them by showing enlarged "simulations" of worst-case scenarios (like 1-pixel checkerboards - using those anywhere would be insanely bad design) that you'll hardly notice in normal use. It then presents a load of prescriptive good/bad tables which act as if "looks like 1440p" is the only possible option, even on 24" and 32" displays where "looks like 1920x1080" (integer-scaled) and "looks like 3840x2160" (native) - respectively - would be the obvious modes to use.
It's just to show the issues. 24" 4K monitors also basically don't exist. They are pretty much as rare as 5K displays with a handful on the market for high prices. The vast majority are 27" or 32". 32" are also large enough that they kind of stop being Hi-DPI displays and start being very sharp low DPI displays.
 
24" 4K monitors also basically don't exist.
True - although that seems to be a recent development, they were around when that blog was written.
Plenty of 24" 4k TVs with HDMI around, though, if you're on a budget.

The vast majority are 27" or 32". 32" are also large enough that they kind of stop being Hi-DPI displays and start being very sharp low DPI displays.
...but then you have that blog putting a 1440p 34" display in a "green zone" and a 32" UHD display with higher PPI in the "red" zone - all based on the conceit that you have to use "looks like 1440p" mode everywhere, and overlooking the fact that most apps let you freely zoom in and out of the actual content you are working on - which isn't always going to be presented using integer scaling anyway.

The "retina/hi-ppi" criterion is based on angular resolution (1 arc minute per pixel, consistent with the definition of 20/20 vision), which depends on viewing distance, not just PPI. Some people have better vision than that, others have worse. A 4K UHD 27" screen is "1 arc minute per pixel" from a viewing distance of about 21", if I recall - so in terms of being able to "see" the pixels (or 1-pixel-sized artefacts) at a normal viewing distance, anything more than 4k@27" for a desktop display will give you diminishing returns. That magic "218ppi" figure is what counts as retina on a laptop - it's somewhat overkill on a desktop display.

The real difference between a high-dpi/retina display and a low-dpi/non-retina display is that a retina display makes the individual pixels - or pixel-sized artefacts - much harder to notice, so you don't have to worry about scaling artefacts etc.

...so it's all rather a piece of string, depending on how far you like to sit from the display, but generally you'll sit further away from larger, desktop-mounted displays than you would smaller ones.
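If anyone wants to sanity-check those figures, a quick back-of-envelope calculation (assuming 16:9 panels with square pixels - nothing here beyond standard trigonometry):

Code:
import math

# Back-of-envelope check of the figures above. Assumes 16:9 panels, square pixels.

def ppi(diag_inches, w_px, h_px):
    return math.hypot(w_px, h_px) / diag_inches

def one_arcmin_distance(pixels_per_inch):
    # Viewing distance (inches) at which one pixel subtends 1 arc minute
    return (1 / pixels_per_inch) / math.tan(math.radians(1 / 60))

for label, density in [("27in 4K UHD", ppi(27, 3840, 2160)),
                       ("27in 5K",     ppi(27, 5120, 2880)),
                       ("218 ppi (the laptop 'retina' figure)", 218)]:
    print(f"{label}: {density:.0f} ppi -> 1 arc minute per pixel at ~{one_arcmin_distance(density):.0f} inches")

# 27in 4K UHD: 163 ppi -> ~21 inches; 27in 5K / 218 ppi: ~16 inches,
# i.e. laptop viewing distance rather than desktop viewing distance.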
 
I appreciate this tutorial, @theluggage, which helps me better understand the origins of the differing opinions (and options) within this cult of Mac.

Is the PPI stuff going to be on the final exam? 😇

Prior to 2020, when I needed a 4K monitor for Windows laptop WFH, my Macs were either laptops or a mini which was connected to a TV (pre-Apple TV), so like many, I don't have any experience with 5K or large iMacs, and was barely paying attention to the stories about them over the years.
 
True - although that seems to be a recent development, they were around when that blog was written.
Plenty of 24" 4k TVs with HDMI around, though, if you're on a budget.


...but then you have that blog putting a 1440p 34" display in a "green zone" and a 32" UHD display with higher PPI in the "red" zone - all based on the conceit that you have to use "looks like 1440p" mode everywhere, and overlooking the fact that most apps let you freely zoom in and out of the actual content you are working on - which isn't always going to be presented using integer scaling anyway.

The "retina/hi-ppi" criterion is based on angular resolution (1 arc minute per pixel, consistent with the definition of 20/20 vision), which depends on viewing distance, not just PPI. Some people have better vision than that, others have worse. A 4K UHD 27" screen is "1 arc minute per pixel" from a viewing distance of about 21", if I recall - so in terms of being able to "see" the pixels (or 1-pixel-sized artefacts) at a normal viewing distance, anything more than 4k@27" for a desktop display will give you diminishing returns. That magic "218ppi" figure is what counts as retina on a laptop - it's somewhat overkill on a desktop display.

The real difference between a high-dpi/retina display and a low-dpi/non-retina display is that a retina display makes the individual pixels - or pixel-sized artefacts - much harder to notice, so you don't have to worry about scaling artefacts etc.

...so it's all rather a piece of string, depending on how far you like to sit from the display, but generally you'll sit further away from larger, desktop-mounted displays than you would smaller ones.
What 24" 4k TVs? They don't exist at all. I don't even think there's been a single model. If you can find a 24" TV it's almost always 1080p or even 720p. But even those are rare. Maybe it's different outside North America. And there's only ever been a handful of 24" 4K displays. They were never common.

I don't really care about the green zone stuff, I mainly linked that article because it gives a good example of shimmer and moire patterns. Plenty of people notice those and the blur at totally normal desktop sitting distances. Some people don't or don't care BUT THAT'S WHY PEOPLE CARE.

Again, personally I don't really care (but I could see it!).
 
Or how much desk space you have available LOL

I'm coming from a Windows 10 laptop to MacOS with a 4-monitor setup: 1x 27" Dell U2723QE 4K, 2x 24" Dell U2410 1080p, and 1x 14" 2560x1600 portable monitor via a USB-C non-Thunderbolt dock. Need a new Thunderbolt 4 dock for the other monitors.

Well, there is physical desk space - my 27" 5K monitor is for all practical purposes the same size as my 27" 4k monitor - but from your listing all the monitors, maybe the point you are trying to make is that it is easier to have more 4k monitors than 5k, and hence... more digital desk space?
 
What 24" 4k TVs? They don't exist at all. I don't even think there's been a single model.
OK - I may have been thrown off by some bad search results there.

I don't really care about the green zone stuff,
...but simplistically flagging screens as good/bad like that is not helpful to people trying to decide on a display - and I've often seen those tables reproduced on their own.

Also, you've got some people using that article to suggest that getting a 1440p display is a good idea when such a display wouldn't be capable of rendering the single 4k pixel details in those illustrations at all.
 
Been following this thread with interest; I recently researched a monitor purchase and was torn between a 4K 27" Dell and a 5K 27" ASUS ProArt display. I was edging toward the ASUS, but then Woot! had some 'open box' condition Dells of the specific model I wanted and that created a nearly $480 effective price difference (figuring in sales tax concerns, etc...), so I got the Dell. I have a thread detailing my research into the topic and various factors involved in display shopping: Initial Thoughts on Dell UltraSharp U2723QE.

One problem with comparing monitors is that it's often an 'apples to oranges' comparison. For example, people compare 3rd party options to the Apple Studio Display. Yes, the ASD has 5K high resolution, 600 nits brightness and excellent color accuracy right out of the box per reviews, and you can control brightness and volume from your Mac keyboard (I believe). For around $1,600 without the height adjustment version of the stand. But it's a package deal:
1.) Lauded metal build quality - on something that sits on your desk, perhaps rarely touched.

2.) Thunderbolt - but Thunderbolt displays are often more expensive, and the ASD's USB-C out ports aren't Thunderbolt. I wonder what the point of it being 'Thunderbolt' is?

3.) It has spatial audio - reviewed as much superior to most monitor speaker systems. Great...but do you have external speakers you use instead of your monitor's?

4.) You don't get VESA mount holes unless you forego having a stand (IIRC). The height-adjustable stand is $400 more!

5.) Not only is it a very bright monitor, but it's glossy whereas it seems most competitors are matte to cut down on glare. That's going to give it that extra 'pop' in some settings.

6.) It is Thunderbolt-only and made to work with Macs; it can be used with a PC, but from what I've heard that's an inferior experience, and you won't be hooking it up by HDMI. Don't expect to use this with a gaming console.

7.) Due to high cost, you'll probably buy AppleCare for it, adding even more to the price.

If we ignore cost concerns, does the ASD offer Mac users a superior experience to 3rd party 4K monitors? I expect so, particularly if you care about the solid build quality and look, webcam with Center Stage, etc...

But I'm not paying that. You could invest a large amount in an ASD expecting to use it over 10 years, etc..., but will you regret that if in a year or two they put out a superior next generation model (say, 90 or 120-Hz refresh rate)?

I think the closest 'apples to apples' comparator for 3rd party 27" 4K monitors is the new ASUS ProArt PA27JCV, and you can read more about it (including links to excellent reviews) in this thread - 5K, 6K or 8K: new Asus ProArt monitors. Here's my reasoning:

1.) 27" 5K Display with strong reviews and lauded color. Note: it is matte.
2.) Mainly plastic body, like many 4K 27" displays.
3.) Speaker system underwhelmed reviewers. No webcam or Center Stage.
4.) USB-C DisplayPort Alt. Mode but not a Thunderbolt monitor.
5.) A very recent release, so you won't see much in the way of sales for awhile.
6.) Around $800. B&H Photo&Video has it, and if you use their Payboo credit card (no annual fee), you can get the sales tax discounted, so roughly $800 and it's yours (at least in the continental U.S.?).
7.) It's not made to work with Macs at the expense of PCs. If you decide to use it with a PC, it should not offer an inferior experience.

This is what I would (and did) compare against a 4K 27" monitor under consideration.

But the display world is evolving pretty fast. OLED is on the rise. Every 5K 27" display I know of is 60-Hz, but demand for 'gaming' displays may make higher refresh rate 4K options available. We now have Thunderbolt 5 and a new HDMI spec. may come out soon (January 2025?), so driving high res./high refresh rate displays may get more practical.
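To put some very rough numbers on why the newer links matter - ballpark only, ignoring blanking, encoding overhead and DSC compression:

Code:
# Ballpark uncompressed bandwidth for a few display modes. Ignores blanking,
# encoding overhead and DSC compression, so real links need more headroom than this.

def gbps(w, h, hz, bits_per_pixel=30):   # 30 bpp = 10-bit colour x 3 channels
    return w * h * hz * bits_per_pixel / 1e9

modes = [("4K UHD @ 60 Hz",  3840, 2160, 60),
         ("4K UHD @ 144 Hz", 3840, 2160, 144),
         ("5K @ 60 Hz",      5120, 2880, 60),
         ("5K @ 120 Hz",     5120, 2880, 120)]

for name, w, h, hz in modes:
    print(f"{name}: ~{gbps(w, h, hz):.0f} Gbit/s uncompressed")

# Roughly 15, 36, 27 and 53 Gbit/s respectively - which is why higher refresh
# rates at 5K and above lean on newer links and/or DSC compression.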
 
Obviously 4K is still by far the most common hi-DPI monitor resolution, and yet there is so much discussion about 4K vs 5k.

I’ve got a 27” 4K and I think it… looks fine? It’s running at “1440p” or whatever you’d call it.

As I understand it, Mac likes "looks like" sizes of exactly half the panel resolution, and 4K isn't 2x 1440p, so it actually mimics 5k internally and then scales that down to fit the 4K panel.

And back to my original point… everything seems to work/look fine? So why is everyone acting like 4K is such a problem?

I’m potentially considering a 32” monitor, so this is mainly why it’s also popping up in my mind. Add in that I might consider a different screen ratio… 16:9, 16:10, or ultra wide…
Because many people have never really tried a 4K monitor and are influenced by what they read on the internet.
Mac OS always offers a sharp display, thanks to its refined upscaling mode.
Fractional scaling with a non-1:1 ratio is an issue for only a few people, so 4K is perfect for 98% of users.
 
But the display world is evolving pretty fast. OLED is on the rise. Every 5K 27" display I know of is 60-Hz, but demand for 'gaming' displays may make higher refresh rate 4K options available. We now have Thunderbolt 5 and a new HDMI spec. may come out soon (January 2025?), so driving high res./high refresh rate displays may get more practical.
Yeah - that's an important consideration if you're thinking about "investing" in a display for the next 10 years.

If you could have bought a display with the 5k iMac panel circa 2015-2017, and assuming that nothing went wrong, you'd still have a really good display today, with not a lot around to beat it - but that's kinda the problem: the panel in today's Studio Display is only a slightly brighter variant on the same 2015 technology (I think the P3 gamut and new anti-reflective coating came along in 2015?), so is that 10-year-old tech still going to be hard-to-beat 5-10 years down the line? It'll still be a good display, but quite likely outclassed by OLED or microLED panels. That has to be weighed against the significant premium you'd be paying over a 4k screen and all the other non-panel-related drawbacks you mention.

That said - one benefit of the Studio Display that gets missed is the anti-glare coating: not the nano option, which I've never tried, but the "standard" optical coating which - I'm assuming it is the same as the 5k iMac's - is really, really effective. Personally I'd always prefer a shiny display over matte: if you've got a bright light source right behind you, you're holding it wrong anyway, and a matte display just disguises the problem while still decimating the contrast. If you've got light coming from the sides then a glossy display reflects it off to the other side, away from your eyes, while a matte display scatters it into your eyes and destroys the contrast. Apple's optical coating is really the best of both worlds, keeping a glossy display while significantly dimming the reflections.

While I'm generally happy with my current 4k+ displays, which have a fairly good, non-sparkly matte finish, the one annoyance is the way the picture gets completely wiped out by a bit of sunlight from the window to the side, c.f. my old 5k iMac.
 
I've been out of Mac land for the past 7 years or so, but the M4 Mini has me really tempted. I don't have any 4K or 5K monitors though and was going down the same path as drrich2. I simply can't justify the cost of an ASD, but the Dell U2723QE and Asus ProArt 5K displays? Maybe! Anyway, just wanted to say thanks to theluggage and drrich2 for their insights above.
 
I was happy with a 27" monitor and have had a 32" monitor @ 1440 for a while now. It looks great to me, and I can do some photo work with no issues. I also have a lower-end "graphics" monitor, which is not cheap but not crazy expensive.

Yes, Apple seems to be able to ignore certain things people want because in some ways, they are the only game in town if you don't want to be on Windows. One would hope Apple would be at the forefront of great graphics and adaptability but alas, not so. I should also mention that in the beginning of the M chip Macs, many people had issues with certain monitors and Apple FAILED to address these in a timely manner. Imagine having a great monitor only to find the new Mac cannot be hooked up to it. Shameful on Apple's part not to at least warn people. Some monitors worked okay with a slight tweak and others did not.
 
If you could have bought a display with the 5k iMac panel circa 2015-2017, and assuming that nothing went wrong, you'd still have a really good display today, with not a lot around to beat it
I'm transitioning off a 2017 5K 27" iMac right now, and yes, the display is beautiful. And I've been a personal computer user since starting with a Commodore 128 system as a high school graduation present.

For a long time, displays slowly got larger with moderately higher resolution - 14" to 17" to 24" to 27", etc... But I think refresh rates have been mostly at 60-Hz for a long time. And displays mainly focused on office work or home users with similar wants/needs. People played some games on their systems, but I don't recall seeing much mention of gaming monitors as a class until more recent times. The move from CRT to flatscreen was big.


Lately things have changed. Gaming has grown mightily, as have demands of hardware. We don't see it as much on the Mac, but on the PC side some gamers spend a lot of money on non-integrated video cards. Competitive multi-player online games made display performance more important.

So now we've got escalating refresh rates, lower latency, a shift to OLED (at added cost and not as high res. available), 32" monitor size, local dimming zones, a lot going on now.

So a 2017 iMac display still rocks today; would an Apple Studio Display still rock 8-years from now? I wonder.
 
Because 4k is only really visible on 65-inch or larger displays. At less than 65 inches it is a waste of money, memory and pixels.
 
Personally, I'm all in on Retina for Mac use. It's part of why I love Macs, especially MBPs. Did it hurt buying an ASD? Yes. Is it a beautiful display that I'll get to enjoy for longer than any of my previous 5K iMacs? Also yes.

I do have a PC with a much smaller dual monitor setup. I still try to get the highest resolution possible at a smaller size, since I tend to sit fairly close to my monitors and still have good eyesight. I'm fortunate that 60hz doesn't bother me, so I'm not going crazy looking for high refresh rates.
 
This is the exact same conclusion I came to - 27" QHD at 2560x1440 is the sweet spot for price/function/performance. Excellent for long sessions.

I had an iMac 27" and 2x 28" 4K monitors. The iMac screen was fantastic - the Samsung 4Ks not so much.

So when I upgraded to the Mac Studio M1 Max I went with 3x 27" HP QHD monitors at 2560x1440. Very pleased. May buy one more. I feel 32" is just too big for the same real estate - would need to go to much larger 49+ screens. Multiple 27s is more flexible than the monster screens.
Interesting.
I'm transitioning off a 2017 5K 27" iMac right now, and yes, the display is beautiful. And I've been a personal computer user since starting with a Commodore 128 system as a high school graduation present.

For a long time, displays slowly got larger with moderately higher resolution - 14" to 17" to 24" to 27", etc... But I think refresh rates have been mostly at 60-Hz for a long time. And displays mainly focused on office work or home users with similar wants/needs. People played some games on their systems, but I don't recall seeing much mention of gaming monitors as a class until more recent times. The move from CRT to flatscreen was big.


Lately things have changed. Gaming has grown mightily, as have demands of hardware. We don't see it as much on the Mac, but on the PC side some gamers spend a lot of money on non-integrated video cards. Competitive multi-player online games made display performance more important.

So now we've got escalating refresh rates, lower latency, a shift to OLED (at added cost and not as high res. available), 32" monitor size, local dimming zones, a lot going on now.

So a 2017 iMac display still rocks today; would an Apple Studio Display still rock 8-years from now? I wonder.
Note that gaming is a special category because [unlike other users] gamers do need fast response times but do not need color accuracy, brightness, etc.
 
My setup: I run three monitors. My main monitor is a 5k 27" iMac. One of my side monitors is a 27" 4k Dell. So I'm continuously seeing a direct comparison. I've always run the 4k at integer scaling, so my comments will apply to that usage. [I can see the loss in sharpness with non-integer scaling, and find it unacceptable.] Integer scaling does mean the UI is somewhat bigger, but I find that to be a minor inconvenience.

About me: I do mostly text work. I've got very good eyesight (20/13), and am very sensitive to sharpness.

My findings:

Through High Sierra, MacOS offered native subpixel text rendering, which effectively increased the horizontal resolution by 3-fold. Under those OS's, I was happy with my 4k. But when I needed to switch to a newer OS, I was bothered by a loss of sharpness. I had no idea why it looked less crisp, until I did some research and found out about the subpixel text rendering. I've found the only way for text to look sharp on current MacOS's is to use a 5k, which is why I upgraded.
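For anyone curious what subpixel rendering actually bought you, here's a purely conceptual sketch - not Apple's implementation, just the idea of sampling glyph coverage at 3x horizontally and spreading it across the RGB stripes:

Code:
import numpy as np

# Purely conceptual sketch of RGB-stripe subpixel text rendering (not Apple's code).
# Glyph coverage is sampled at 3x the horizontal pixel resolution, and each group of
# three samples drives the R, G and B stripes of one physical pixel, so an edge can
# land on a subpixel boundary rather than a whole-pixel boundary.

width_px = 4
coverage_3x = np.array([0.0, 0.0, 0.3, 1.0, 1.0, 1.0, 1.0, 0.6, 0.0, 0.0, 0.0, 0.0])
assert coverage_3x.size == 3 * width_px

# Subpixel: map triples of samples onto the (R, G, B) channels of each pixel.
# (Real implementations also filter across neighbours to suppress colour fringes.)
subpixel_rgb = 1.0 - coverage_3x.reshape(width_px, 3)          # black glyph on white

# Greyscale AA for comparison: one value per whole pixel, edge detail averaged away.
greyscale = 1.0 - coverage_3x.reshape(width_px, 3).mean(axis=1)

print("subpixel (R,G,B per pixel):\n", subpixel_rgb)
print("greyscale (per pixel):", greyscale)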

I've also found that, once you get to Retina pixel density (~220 ppi on an external monitor), the screen needs to be glossy for text to look optimally sharp. I did a side-by-side comparison of glossy and nanotextured ASD's, and found the text to be sharper on the former.
 
Iunno. I’m very happy with fractional scaling on a good quality 27” 4K from Dell. Matches nicely with fractional scaling (highest resolution) on my laptop; no evident artifacts; can’t see pixels unless I press my face up to the screen; can read text (aka code and docs) comfortably all day.

YMMV if you’re an artist, photographer, video editor, etc.

For larger screen dimensions, I would want higher resolution. At least 6K for 30”.
 
My setup: I run three monitors. My main monitor is a 5k 27" iMac. One of my side monitors is a 27" 4k Dell. So I'm continuously seeing a direct comparison. I've always run the 4k at integer scaling, so my comments will apply to that usage. [I can see the loss in sharpness with non-integer scaling, and find it unacceptable.] Integer scaling does mean the UI is somewhat bigger, but I find that to be a minor inconvenience.

About me: I do mostly text work. I've got very good eyesight (20/13), and am very sensitive to sharpness.

My findings:

Through High Sierra, MacOS offered native subpixel text rendering, which effectively increased the horizontal resolution by 3-fold. Under those OS's, I was happy with my 4k. But when I needed to switch to a newer OS, I was bothered by a loss of sharpness. I had no idea why it looked less crisp, until I did some research and found out about the subpixel text rendering. I've found the only way for text to look sharp on current MacOS's is to use a 5k, which is why I upgraded.

I've also found that, once you get to Retina pixel density (~220 ppi on an external monitor), the screen needs to be glossy for text to look optimally sharp. I did a side-by-side comparison of glossy and nanotextured ASD's, and found the text to be sharper on the former.
Of course text will simplistically be sharper on glossy. The issue is how much one is offended by glare. Those who are offended by glare [me] prefer the matte finishes. YMMV
 
MacOS works well at 100dpi, or 200 dpi, but has issues around 150dpi. So I worked out what 4k @100dpi would be, and that came out to 42 inches. So here we are:

[Attachment: Desktop.jpg]


My friend had already done the same thing, so I was confident. (He has the magnification up higher though, and doesn't like how I have it.) I often use this as if it were two monitors, with long windows side by side. But full screen also works nicely. In a few years we will have 8k at this size, but for now this is nice.
... And it only cost me $500. It could have been even less, but I wanted good color.
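For anyone checking my arithmetic, the quick calculation (assuming a 16:9 panel with square pixels):

Code:
import math

# Quick check of the "4K at roughly 100 dpi works out to about 42 inches" arithmetic.
# Assumes a 16:9 panel with square pixels.

w_px, h_px = 3840, 2160
diag_px = math.hypot(w_px, h_px)     # ~4406 pixels along the diagonal

for diag_in in (42, 43, 44):
    print(f'{diag_in}" 4K: {diag_px / diag_in:.0f} ppi')

# 42": ~105 ppi, 43": ~102 ppi, 44": ~100 ppi - all in the comfortable 1x zone
# mentioned above.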
 
Well, there is physical desk space - my 27" 5K monitor is for all practical purposes the same size as my 27" 4k monitor - but from your listing all the monitors, maybe the point you are trying to make is that it is easier to have more 4k monitors than 5k, and hence... more digital desk space?
I was speaking of physical desk space only.

Any on my 27" Dell U2723QE 4K, I switched from 1920x1080 to 2624x1476 via Betterdisplay, seems ok and closer to the Windows 10 scaling I had for the monitor. Will see if I can adjust to this.
 
I have 2x Studio Displays which I love because the 5K really looks very good.

However, I have a 4K BenQ monitor at my parents' house. My main issue with it is that it's not bright enough, but the 4K vs 5K difference on a 27", while noticeable, just isn't a big deal. I can tell the difference but it doesn't ruin my day.

If you have the money then go 5K - you won't regret it. If you don't then just get a 4K and sleep sound - you're not missing all that much.
 
I used a 43" 4k Samsung TV for several years. Worked great.

Ultimately it started getting this bright line in the lower right. Invisible for TV usage but very obvious with computer usage lol.

Now I'm on a Dell 43" 4k monitor.
 