
hajime

Hi, I like 4K displays, but unless it is on a 50" display I need to scale up the fonts to see the letters. This includes a 16" MBP running Windows via Boot Camp and Lenovo laptops with a 4K screen. For those who have been using 5K or even 6K displays, do you find the extra resolution useful in practice? Can you really tell the difference between 4K, 5K and 6K? Are 5K and 6K better for viewing videos and photos than for programming and text reading? I guess there are limitations to human eyes, so above a certain resolution we cannot tell the difference.
 
macOS has this great HiDPI mode. That's why, for example, the GUI of the 5K iMac size-wise looks like 2560 x 1440, but everything is much sharper than on a "native" 2560 x 1440 panel. So yes, 5K and 6K are immensely useful (on macOS).
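The 2x relationship can be sketched with a few lines of arithmetic (illustrative only, not how the actual compositor works): each logical point is drawn with a 2x2 block of physical pixels, so UI elements keep their size while quadrupling the pixels behind them.

```python
# Illustrative HiDPI arithmetic: a 2x scale factor halves the "looks like"
# resolution while every UI point is backed by 2x2 physical pixels.
def looks_like(physical_w, physical_h, scale=2):
    """Logical ("looks like") resolution for a given physical panel."""
    return physical_w // scale, physical_h // scale

# 5K iMac panel: 5120x2880 physical -> GUI sized like 2560x1440
print(looks_like(5120, 2880))  # (2560, 1440)
```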
 
The main point of Retina and HiDPI displays is superior text clarity.

It's less noticeable for people using Western alphabets. It's REALLY noticeable for people using logographic characters: Chinese, Japanese, Korean, maybe Arabic script and some others.

Find two monitors: one Retina, one not. Then visit a site like Nikkei.com and look at the characters. You don't even need to be able to read it. However you will be able to see with your own eyes what a difference Retina/HiDPI makes. It is night and day.

This is nothing new. Apple introduced this where it made the most sense: the iPhone screen.
 
It's less noticeable for people using Western alphabets. It's REALLY noticeable for people using logographic characters: Chinese, Japanese, Korean, maybe Arabic script and some others.
That is very interesting to know, never thought about that. Thank you!

The main point of Retina and HiDPI displays is superior text clarity.
Personally, I think Western alphabets also gain a lot, but this really varies from person to person. One of the main reasons I switched to Mac was the Retina display. Since then I have praised high-resolution displays, but I have learned from friends and colleagues that many of them just don't care. With many of them I did the test: we looked at the same text or website on different displays side by side - I was curious. All of them stated that hi-res displays are sharper, but a healthy majority said they didn't care much and would not spend more money for better display resolution. Fascinating!
 
As a software developer, I find I can't bear to look at most Windows machines. I can't believe they are still putting 1080p screens on laptops and large monitors; reading text on those screens is tough.

IMO, for text on anything above 23" you ideally want 1440p; below that density, text loses its clarity. I was effectively downsized at work when they "upgraded" me to a 1080p 27" monitor. All I could see were the pixels in the text...
 
It's less noticeable for people using Western alphabets. It's REALLY noticeable for people using logographic characters: Chinese, Japanese, Korean, maybe Arabic script and some others.
There is also the effect that even when it's not consciously noticeable, it _is_ noticeable to your eyes and your brain. The text is easier to read, even though you can't figure out _why_ it's more readable.
 
It's easier to see in a side-by-side comparison. This is not unique to video displays.

Pretty much everything is like this.

I used to work at a company that switched from offset lithography to digital printing for product labels and the difference was very noticeable to me even if the latter was still very high quality.

Some people's senses are less attuned than others.

Remember the DVD to Blu-ray transition? There were a lot of people who claimed that they couldn't see a difference between NTSC resolution and HD resolution.
 
Hi, I like 4K displays but unless it is on a 50" display, I need to scale up the fonts to see the letters.

Also true of 5k displays - "raw" 5k mode would be unusable on a screen whose size in inches was less than your age in years. 5k Macs don't even let you select "raw 5k" mode without jumping through hoops - they default to the "looks like 2560x1440" HiDPI mode, which scales up the system text, icons etc. by 2x. The result is that, on a 27" display, the system text, icons etc. are the same size as on an old 1440p non-retina iMac or Thunderbolt/Cinema display, only much sharper.

Now, some 4k displays will come up in "raw 4k" mode with tiny text, but the optimum mode would be "looks like 1920x1080" mode which, on a 21" screen, gives the same sized text as a pre-retina 21.5" iMac but again, much sharper. On a larger, 27" or so screen, the system text and icons in that mode start to look a bit big, and use a lot of screen "real estate" but they're still pin-sharp unless you have bionic vision.

Those modes are the optimum for 4k and 5k respectively - most modern software will be rendering everything at the screen's "native" resolution, and any old software or low-res bitmaps will be scaled by exactly 2x, which minimises any 'artefacts'.

Thing is, if you want to stick with those "optimum" modes you end up wanting to choose your screen resolution to match the display size.

So the sweet spots for macOS are 21" @ 4k, 27" @ 5k and 32" @ 6k - by moving up you're not adding sharpness, you're adding screen area.
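Those sweet spots fall out of the numbers: all three pairings land near the same ~218 pixels-per-inch density. A quick check, using the published panel resolutions:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size (Pythagoras)."""
    return math.hypot(width_px, height_px) / diagonal_in

# The Mac "sweet spot" pairings all sit around ~218 ppi:
print(round(ppi(4096, 2304, 21.5)))  # 21.5" 4K iMac       -> 219
print(round(ppi(5120, 2880, 27.0)))  # 27" 5K iMac         -> 218
print(round(ppi(6016, 3384, 32.0)))  # 32" Pro Display XDR -> 216
```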

(Those sweet spots are a bit of a Mac thing - Windows lets you freely choose the screen scaling, which sounds better than the Mac approach but is very dependent on software behaving itself. I wouldn't bother with 5k on Windows, and based on the number of 5k displays currently widely available - order-of-magnitude 0 - the world agrees with me.)

So that's a problem because the only 27" 5k display currently available in most places (the LG one is widely unavailable) comes with a sometimes inconveniently attached iMac, while the only 6k display on the planet costs $6000 or $7000 if you can't find some bricks to prop it up on.

So, if you want to get an affordable 27" or larger display, the options are:
(a) get that 50" 4k model that you can read in "raw 4k" mode, or
(b) get a smaller 4k and run it in one of the "non-optimum" scaled modes.

(b) is probably the answer if you've got a Mac with a half-decent GPU. The "non-optimum" scaled modes work by rendering to an internal buffer at twice the "looks like" resolution and then re-sampling to the native display resolution. The result is very, very good - don't be put off by past experience of running a low-res LCD in non-native mode.
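The buffer arithmetic for these scaled modes can be sketched like this (the function name is made up for illustration; the 2x-render-then-downsample pipeline is as described above):

```python
# Sketch of macOS's "non-optimum" scaled modes: render into a buffer at
# twice the chosen "looks like" size, then resample that buffer down to
# the panel's native resolution.
def scaled_mode(looks_like_w, looks_like_h, native_w, native_h):
    buffer_w, buffer_h = looks_like_w * 2, looks_like_h * 2
    ratio = native_w / buffer_w  # downsampling factor to the panel
    return (buffer_w, buffer_h), round(ratio, 2)

# "Looks like 2560x1440" on a 4K (3840x2160) panel:
# rendered at 5120x2880, then downsampled by 0.75x.
print(scaled_mode(2560, 1440, 3840, 2160))  # ((5120, 2880), 0.75)
```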

I have a 5k iMac and, as a second screen, a cheapo 28" 4k Dell. The colour reproduction on the Dell isn't a patch on the iMac (better 4k screens are available), but in terms of sharpness it is very good. I normally run the Dell in "looks like 2560x1440" mode, which effectively means that the screen is rendered at 5k and then downsampled to 4k. Size/screen-estate-wise, the result is the same as the 5k (well, 1" bigger). It is slightly "softer" than true 5k, but most people with mortal eyesight would only worry if they looked unnaturally closely.
 
Recently it looks like they have 4K2K Ultrawide displays. How do those compare with a regular 4K display?

I tried one of those earliest curved monitors by LG but I felt very sick using it for less than 5 minutes.
 
Not for the 27-inch size in most cases—the price to go to a 5K display over a 4K display is massive and the returns are pretty limited IMO. For 32 inches, I think 5K or 6K makes a lot more sense to improve the clarity of text, especially small text.
 
Recently it looks like they have 4K2K Ultrawide displays. How are those compared with regular 4K display?
Er... wider?

BTW: I don't think 4K2K and Ultrawide are the same thing - not that any of these 4k/5k marketing labels are used consistently, anyway.

"4k" means any display that is around 4000 pixels wide - most commonly it is used to refer to "UHD" (3840 x 2160), so apparently 3840 is "close enough" to 4000. There's also a cinema-industry "DCI 4k", which is 4096x2160; the 21.5" 4K iMac uses 4096x2304.

I can't find a straight definition of "4K2K" anywhere, but it could mean any display that is around 4000 pixels by 2000 pixels - which would be all of the above - and seems to be mainly used to refer to plain old "4K UHD".

"Ultrawide" usually means an aspect ratio of 21:9 or wider, and tells you nothing about the resolution. You've got to look at the vertical resolution and compare it to a standard-width (16:9) display of the same physical height - which means either digging through the specs or revising your Pythagoras because the adverts always show the diagonal size. Often, with an ultrawide display, that's only something like 1080 or 1440 pixels, so the screen isn't going to be any sharper than a non-retina 1920x1080 or 2560x1440 display. Basically, they're for gaming.
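The "compare by physical height" check is just Pythagoras. A small sketch (the panel sizes are typical examples, not from any particular product):

```python
import math

def vertical_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the vertical axis, via aspect ratio."""
    aspect = width_px / height_px
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    return height_px / height_in

# A 34" 21:9 ultrawide at 3440x1440 is about as dense as a 27" 1440p
# 16:9 panel - the extra inches are width, not sharpness:
print(round(vertical_ppi(3440, 1440, 34)))  # 110
print(round(vertical_ppi(2560, 1440, 27)))  # 109
```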

Likewise, although 5K here usually refers to 5120x2880 displays as per the 5k iMac, there are 5K Ultrawide displays with 5120x2160 resolution. These aren't going to be any sharper than a good 4k UHD display of the same height - you'll just get more space to the left and right.

Also remember that "sharpness" and whether a display is "retina" depends on viewing distance, which is a personal preference.... unless you've got a curved screen, in which case your head is presumably supposed to be at the centre of the circle...
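Viewing distance can be folded in by working in pixels per degree rather than pixels per inch. A rough sketch, using the common ~60 ppd rule of thumb for 20/20 acuity (that threshold is a convention, not something from this thread):

```python
import math

def pixels_per_degree(ppi, viewing_distance_in):
    """Angular pixel density: how many pixels span one degree of vision."""
    return 2 * viewing_distance_in * math.tan(math.radians(0.5)) * ppi

# A ~218 ppi desktop panel viewed from 20" comfortably clears 60 ppd:
print(round(pixels_per_degree(218, 20)))  # 76
# The same density at 10" (phone distance) falls well short of it,
# which is why phones ship much higher ppi than desktop "retina":
print(round(pixels_per_degree(218, 10)))  # 38
```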
 
Also true of 5k displays - "raw" 5k mode would be unusable on a screen whose size in inches was less than your age in years. [...] The result is slightly "softer" than true 5k but most people with mortal eyesight would only worry if they looked unnaturally closely.
Thanks for your info here. My LG 5k is getting old, and I've been wondering whether replacing it with a Dell 27 inch 4k would be viable. I'm after that sweet combination of real estate for all the menus and palettes in Adobe apps, with sharpness that the 5k brings.
 