I get your point, but it does go to support the idea that 4K is fine for a substantial portion of people, and the person researching options may have some idea how sharp their eyesight is, how discerning and appreciative they are of fine detail differences, etc... There's still some guesswork to it.
This, exactly. I have really discerning eyesight and I love my 4k 27" gaming monitor. It looks incredibly sharp.
 
  • Like
Reactions: Chuckeee
I still have an original AppleColor 14" display up in the attic.
640x480 and nothing looked better in its time.

I'm tempted to bring it down and see how it compares in sharpness to my Dell UltraSharp...
 
  • Haha
Reactions: eltoslightfoot
I still have an original AppleColor 14" display up in the attic.
640x480 and nothing looked better in its time.

I'm tempted to bring it down and see how it compares in sharpness to my Dell UltraSharp...
That would actually be awesome haha.
 
@drrich2
How difficult would this be for Apple to implement?
That's challenging to answer without really knowing the number of bitmap assets in macOS… there's probably some way to scan the file system, but in any case the number is certainly in the thousands, if not tens of thousands.

The real problem is that you're adding to the complexity of the OS. Right now it has a "standard" mode and a "retina" mode. You're proposing a third "subretina" mode. Now every bit of the code that asks "am I this or that" has to turn into "this or that or a third other thing". That's not difficult to implement but it is hugely time-consuming because this kind of code is all over the place.

(Oh, and we haven't even brought up third-party support. This turns into a headache for Adobe, Microsoft, etc.)

All for an experience that again many people might say is "good enough" but is objectively worse.
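For anyone curious about the "scan the file system" aside above, here is a very rough sketch of how one might count loose bitmap assets. The paths and extensions are my assumptions, and most modern macOS assets live inside compiled Assets.car catalogs, so a scan like this would undercount badly:

```python
#!/usr/bin/env python3
# Rough sketch: count loose bitmap assets under common macOS system folders.
# Assumptions: /System/Library and /Applications are the relevant roots, and
# only a few common bitmap extensions are counted. Many assets live inside
# compiled Assets.car catalogs, so the real total is far higher than this.
import os
from collections import Counter

ROOTS = ["/System/Library", "/Applications"]       # assumed locations
EXTS = {".png", ".icns", ".tiff", ".tif", ".jpg"}  # assumed bitmap types

counts = Counter()
for root in ROOTS:
    for dirpath, _, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext in EXTS:
                counts[ext] += 1
                # "@2x" in the filename marks a Retina (2x) variant
                if "@2x" in name:
                    counts["@2x variants"] += 1

for key, n in counts.most_common():
    print(f"{key}: {n}")
print("total bitmaps:", sum(v for k, v in counts.items() if k.startswith(".")))
```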

Quick follow-up to prevent misunderstandings. I acknowledge Mac minis + Mac Studios (desktop-only Macs) are a really small % of the total Mac market. That said, many MacBooks are used predominantly in clamshell mode as de facto desktops, or at least often attached to external displays even if also used as portables. Many of these external displays are 4K monitors in the 27" to 32" size range.

You're correct that laptops with external displays (whether in clamshell or not) should be factored into the value of this. Honestly should have occurred to me as I use an MBP (open) with a Studio Display at home.

I do know that around 90% of Macs sold are portables. I do visit a lot of offices, and my impression is that around 10-15% of users use external displays, but I wouldn't be at all surprised if I was wrong in either direction.

Do we have any credible numbers on what % of Mac users use 5k displays (Apple, Samsung, LG, the new ASUS ProArt 5K, etc...) - that aren't old discontinued 27" iMacs? If we include old iMacs, 5k is more common than I credited (one sits on my desk, too), but that's a discontinued line and most people won't go through the ordeal of converting them to external displays. Until the ASUS came out, 5K displays tended to be nearly a grand and up. Many discussions on MacRumors feature people struggling with whether to go for an M4 Pro over a base M4, 24 gig RAM over 16, 512-gig SSD over 256, etc... I suspect to many of them, 4K 27" displays look pretty good.

"pretty good" that's the thing, though. Now we gotta talk about the actual personality and goals of Apple. They don't see themselves as a "pretty good" company. They make the premium product and this shows across the entire product line (to the point where everyone thought it was really bizarre and out of character when they made the iPod Shuffle).

Dedicating resources and increasing the complexity of the OS for a subset of users who don't want to spend money on a better display makes no sense on any level for a company like this.

One last thing: companies like Apple look at "where the ball is going." Regardless of what the display market is like today, the ones sold next year, and the year after that, will be incrementally better. Eventually 4k will become obsolete and whatever replaces it will be better. That's where Apple is going to aim.
 
  • Like
Reactions: drrich2
One last thing: companies like Apple look at "where the ball is going." Regardless of what the display market is like today, the ones sold next year, and the year after that, will be incrementally better. Eventually 4k will become obsolete and whatever replaces it will be better. That's where Apple is going to aim.
That ball, particularly as it pertains to 4K displays, is pretty hard to anticipate these days.

The 4K 27" display is solidly mainstream today, from what I understand (U.S. perspective; I don't know about other nations). The 5K 27" display segment is much smaller (I went to Amazon to see their 27" 4K vs. 5K options; huge difference). The new ASUS ProArt 5K for around $800 brought the premium for 5K way down (and not just a sale price!). I struggled with that vs. an on-sale for $435 Dell U2723QE 4K 27" until Woot! offered the later 'open box quality' for (with tax and all) for me $320.50, so I went with that.

I kept thinking maybe 5K would supplant 4K in the 27" display, but not only has that not noticeably happened, a lot of people seem happy with 4K 32" displays. I haven't worked with one of those, on a Mac or Windows.

And 4K seems to be just getting to the 27" party where OLED is concerned, though I've read text quality isn't quite as good with the OLED route (and prices are substantially higher). I haven't heard of any 5K 27" OLED displays.

6K sounds like a shoo-in for the 32" space, but the Dell I'm aware of is priced over 2 grand.

My suspicion is that 4K will continue to dominate the 27" display market for some time to come, 5K will gain a little ground on the Mac side, and OLED will be the main 'ball' more people are watching.

But if someone can get 32" 6K IPS Black brand name displays out there for (I'm just gonna pick a number) $1,200 or less, that could be a game changer.

P.S.: In light of the original question driving this thread, for those of you using 4K displays who've had direct experience comparing 27" vs. 32" 4K displays, do you notice a substantial difference in text sharpness? I wanted 32", but I wasn't willing to take that chance with 4K.
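For what it's worth, the geometry alone is easy to put numbers on. A small sketch, assuming standard 16:9 UHD panels:

```python
# Quick pixel-density comparison for 16:9 UHD (3840x2160) panels.
# Diagonal sizes in inches; ppi = horizontal pixels / panel width.
import math

def ppi(diag_in, w_px=3840, h_px=2160):
    aspect = w_px / h_px
    width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)
    return w_px / width_in

for size in (27, 32, 42):
    print(f'{size}" 4K: {ppi(size):.0f} ppi')
# 27" comes out around 163 ppi, 32" around 137 ppi, 42" around 105 ppi.
```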
 
  • Like
Reactions: Eric_WVGG
One last thing: companies like Apple look at "where the ball is going." Regardless of what the display market is like today, the ones sold next year, and the year after that, will be incrementally better. Eventually 4k will become obsolete and whatever replaces it will be better. That's where Apple is going to aim.
Apple have been aiming at 5k since 2014 and the ball still hasn't got there yet - at least not in terms of gaining traction in the PC market. A bunch of third-party 5k displays from HP, Dell and Philips launched back then but sank without trace, apart from the LG one, which was promoted by Apple to Mac users.

27", 4k UHD at typical viewing distance puts the pixel size at the limit of 20:20 vision - and that applies to pixel-sized artefacts from fractional scaling, too. Many people will see more detail/sharpness on a 5k@27" screen, but you're in to diminishing returns - especially on Windows which (for better or worse) has freely adjustable UI scaling and so doesn't share the Mac's 220ppi "sweet spot" - 5k on PC is a bit nicer than 4k but probably not worth 2-3x the price. I think the other problem with 5k on PC was that it initially needed either two DisplayPort 1.2 cables or Thunderbolt 3 (and a compatible GPU). We'll see if the new, more affordable, Asus 5k makes inroads, now DP 1.4 and/or TB3 are more prevalent on PCs - I don't get the feeling that the Samsung has been flying off the shelves, though and I somehow doubt that the PC world will embrace 4k - I suspect these displays are still going to be mainly targeted at Mac users and will have to fight against the "look and feel" of the Studio Display.

Thing is - as with a lot of IT - technology is starting to catch up with demand and the mk1 eyeball, the days of struggling with flickery, 30Hz, 320x200 screens are long gone and 4k is now very close to "good enough" for most purposes. 5k is already a busted flush - and what talk there is is about 8k or HDR/HFR. I don't even see 8k breaking out as a mainstream tech for computer displays and traditional domestic TVs (4k TV is barely worth it below about 50") unless someone comes out with affordable, wall-sized screens or projection systems.

Instead - "the ball" has been going towards other refinements than simple resolution, like high dynamic range, improved colour gamuts, higher frame rates via OLED/QLED/microLED etc. These are starting to show up in 4k panels and smaller laptop or tablet screens, but I fear that Apple's attachment to 220ppi on the desktop is a bit of a stumbling block there - expensive "niche" panel sizes for tech like miniLED and OLED and huge bandwidth requirements (5k is already considerably more bandwidth than 4k, now double the frame rate...).

You can hardly call the Studio Display forward-looking... It's, well, slightly brighter but otherwise the same tech as the 2017 iMac/LG UltraFine, and only an incremental improvement in gamut/surface treatment over the 2014 iMac. The Pro Display XDR is already 5 years old.
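On the bandwidth aside, a rough sketch of the raw, uncompressed pixel rates (ignoring blanking intervals and link-encoding overhead, with 24 bits per pixel assumed), which also frames the 5K@60 vs 4K@120 question that comes up later in the thread:

```python
# Rough uncompressed video-signal bandwidth: pixels x refresh x bits per pixel.
# Ignores blanking and link overhead, so real link requirements run higher;
# 8 bits per channel (24 bpp) assumed.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

print(f"4K  60 Hz: {gbps(3840, 2160, 60):.1f} Gbit/s")
print(f"5K  60 Hz: {gbps(5120, 2880, 60):.1f} Gbit/s")
print(f"4K 120 Hz: {gbps(3840, 2160, 120):.1f} Gbit/s")
print(f"5K 120 Hz: {gbps(5120, 2880, 120):.1f} Gbit/s")
# Roughly 11.9, 21.2, 23.9 and 42.5 Gbit/s: 5K is about 1.8x 4K,
# and doubling the refresh rate doubles it again.
```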
 
P.S.: In light of the original question driving this thread, for those of you using 4K displays who've had direct experience comparing 27" vs. 32" 4K displays, do you notice a substantial difference in text sharpness? I wanted 32", but I wasn't willing to take that chance with 4K.

I'm fine with 4k @42". I see sharper text on my iPad and phone, but 100dpi looks fine to me at the distance I sit.
 
drrich wrote:
"I kept thinking maybe 5K would supplant 4K in the 27" display, but not only has that not noticeably happened, a lot of people seem happy with 4K 32" displays. I haven't worked with one of those, on a Mac or Windows."

What I'd like to see are 32" 5k displays.
But nobody makes a panel in that size/resolution, and I doubt we'll ever see it.

I find "looks like 1440p" on a 27" display to be too small for my viewing comfort.
But on 32", for me it would look "just right".
 
I'm fine with 4k @42". I see sharper text on my iPad and phone, but 100dpi looks fine to me at the distance I sit.


I use the 42-43" Asus 4K at non-scaled res, works great. I previously used a 50" Samsung 4K and never had a complaint, but now that I'm with the Asus I can see how overly large the 50" pixels are. Eh, got used to it. I couldn't imagine working in less than 4k now simply for the content alone.


USB-C / TB4 to DisplayPort cable for full 144hz. Works perfectly, full 444 RGB.
 
Thing is - as with a lot of IT - technology is starting to catch up with demand and the mk1 eyeball, the days of struggling with flickery, 30Hz, 320x200 screens are long gone and 4k is now very close to "good enough" for most purposes. 5k is already a busted flush - and what talk there is is about 8k or HDR/HFR. I don't even see 8k breaking out as a mainstream tech for computer displays and traditional domestic TVs (4k TV is barely worth it below about 50") unless someone comes out with affordable, wall-sized screens or projection systems.

Instead - "the ball" has been going towards other refinements than simple resolution, like high dynamic range, improved colour gamuts, higher frame rates via OLED/QLED/microLED etc. These are starting to show up in 4k panels and smaller laptop or tablet screens, but I fear that Apple's attachment to 220ppi on the desktop is a bit of a stumbling block there - expensive "niche" panel sizes for tech like miniLED and OLED and huge bandwidth requirements (5k is already considerably more bandwidth than 4k, now double the frame rate...).
The discussion in this thread, looking at what people value (e.g.: resolution, refresh rate, dynamic range, gloss vs. matte) and at how market forces such as consumer demand drive where the 'ball is going' (i.e., which display features will be focused on and dominate the mainstream), has me thinking about some of those forces.

While a 5K display offers 5K for the interface and some on-system content, I suspect it would significantly enhance the appeal of 5K and 6K displays (and TVs) if there were a lot of 5K or 6K streaming content to watch. And therein lies the problem.

Given that many people rely on streaming these days, and both cable landline and satellite ISP services have bandwidth limitations, can our content delivery infrastructure in the U.S. handle a large-scale shift to 6K or 8K content? Does the U.S. Internet service provider infrastructure have enough capacity to deliver large-scale streaming 6K content to the masses, on top of the other bandwidth-using Internet traffic already in use? Netflix, Disney+, Paramount+, Peacock, Max, Hulu, YouTube: if most of these start streaming a lot of 6K or 8K content, is this going to create problems? What about Dish Network and DirecTV satellite services?

Some years back a common lament with cable modems was that service was slower at times of day (e.g.: evenings) when more people got off work and used the service due to bandwidth bottlenecks. Let's say for sake of argument 25% of the U.S. streaming consumption shifted to 6K video by 2028. Does anyone have some ballpark idea of how much of a problem that would cause?

Also, there are a couple of directions that ball could go. If you had a choice amongst 27" displays between a 5K 60-Hz display and a 4K 120-Hz display, which would you pick?
 
  • Like
Reactions: eltoslightfoot
"...for sake of argument 25% of the U.S. streaming consumption shifted to 6K video..."

Where are you going to get 10,000,000+ hours of 6K properly produced content from?
From GoPros or other YT influencers' DSLR rigs?
You might be getting big shiny 5K/6K logos on the sides of the camera gear, but they will output video with compression that throws away a good proportion of the resolution data to get manageable storage amounts at manageable data rates...
Look at the huge rigs that Apple use to get proper definition video out of their iPhones, or the even bigger rigs the cinema world uses. Unless it's for IMAX projection, 4K is 'good enough' for a lot of productions.

Streamed video is the last place you need >4K video.
That's why 4K 240+Hz rules in the gaming world, unless you're producing a feed for a Times Square billboard...

Where 5K and greater is needed is for static vector graphics, text, etc. And if you can read text or look at vector graphics that change pages at more than 120Hz, then you are a falcon or other bird of prey. ;)
 
Last edited:
  • Like
Reactions: drrich2 and Altis
The discussion in this thread, looking at what people value (e.g.: resolution, refresh rate, dynamic range, gloss vs. matte) and at how market forces such as consumer demand drive where the 'ball is going' (i.e., which display features will be focused on and dominate the mainstream), has me thinking about some of those forces.

While a 5K display offers 5K for the interface and some on-system content, I suspect it would significantly enhance the appeal of 5K and 6K displays (and TVs) if there were a lot of 5K or 6K streaming content to watch. And therein lies the problem.

Given that many people rely on streaming these days, and both cable landline and satellite ISP services have bandwidth limitations, can our content delivery infrastructure in the U.S. handle a large-scale shift to 6K or 8K content? Does the U.S. Internet service provider infrastructure have enough capacity to deliver large-scale streaming 6K content to the masses, on top of the other bandwidth-using Internet traffic already in use? Netflix, Disney+, Paramount+, Peacock, Max, Hulu, YouTube: if most of these start streaming a lot of 6K or 8K content, is this going to create problems? What about Dish Network and DirecTV satellite services?

Some years back a common lament with cable modems was that service was slower at times of day (e.g.: evenings) when more people got off work and used the service due to bandwidth bottlenecks. Let's say for sake of argument 25% of the U.S. streaming consumption shifted to 6K video by 2028. Does anyone have some ballpark idea of how much of a problem that would cause?

Also, there are a couple of directions that ball could go. If you had a choice amongst 27" displays between a 5K 60-Hz display and a 4K 120-Hz display, which would you pick?
I literally picked the 4k 144Hz option. :)

I do agree with you, though. I haven't even bothered updating my two 1080p Panasonic plasma TVs for this reason. The best I bother to pay for is usually 720p (Netflix) or 1080p (most everything else). Definitely not forking over the cash for a marginal improvement.
 
  • Like
Reactions: drrich2
I was quite concerned that my 27" 4k (Dell U2723QE) was not going to look right with macOS due to their janky scaling.

The default setting "looks like 1080p" has integer scaling, so I left it at that and turned down the text size. Made the dock smaller and adjusted a few things in apps, and it all looks great. A few things are a bit larger than they need to be, but that's better than being too small and fatiguing on the eyes.
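For anyone wondering why the default looks clean, here is a sketch of the arithmetic as I understand macOS HiDPI scaling to work (my reading of the behavior, not anything from Apple documentation):

```python
# macOS HiDPI scaling on a 3840x2160 panel, as I understand it: the UI is
# rendered at 2x the "looks like" resolution, then scaled to the native
# panel size if the two don't match.
panel = (3840, 2160)

for looks_like in [(1920, 1080), (2560, 1440)]:
    backing = (looks_like[0] * 2, looks_like[1] * 2)   # 2x render target
    scale = panel[0] / backing[0]                      # downsample factor
    note = "integer (1:1 to panel)" if scale == 1.0 else f"fractional ({scale:.2f}x)"
    print(f"looks like {looks_like}: render {backing}, {note}")
# "Looks like 1080p" maps 1:1 onto the panel; "looks like 1440p" renders at
# 5120x2880 and downsamples to 3840x2160, which is where the softness comes from.
```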

Best part is this monitor has KVM so I can boot either the Mini or my old PC and everything works right away.
 
While a 5K display offers 5K for the interface and some on-system content, I suspect it would significantly enhance the appeal of 5K and 6K displays (and TVs) if there were a lot of 5K or 6K streaming content to watch.
I don't think there's much demand for 5k or 6k video content. Even 4k is a long way from universal on the streaming services. 8k may be a niche for people with serious home cinema setups (and fibre-to-the-premises internet) with 100+" displays (and offer better scaling between 4k and 8k).

What a 5k or 6k display on a computer will do is allow you to edit 4k at 1:1 scaling in a window with space for controls, timelines etc. around it.
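A quick way to see how much room that leaves; the 6K figure below assumes the Pro Display XDR's 6016x3384 panel:

```python
# Leftover screen area around a 1:1 (pixel-for-pixel) 4K viewer window.
uhd = (3840, 2160)
for name, (w, h) in {"5K": (5120, 2880), "6K (6016x3384)": (6016, 3384)}.items():
    print(f"{name}: {w - uhd[0]} x {h - uhd[1]} pixels left over around a 1:1 4K frame")
# 5K leaves a 1280 x 720 margin; 6K leaves 2176 x 1224, plenty for timelines and palettes.
```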
 
Streamed video is the last place you need >4K video.
I get your reasoning. I recall when standard DVD resolution was impressive, then 1080 interlaced was a big, big deal, then 1080p TVs with matching content if you really cared about picture quality, and so on... and now we have 4K.

Companies want to stand out as innovators offering improvements. Not unlike way back when movie theaters wanted something to differentiate themselves from the then fairly new home TV industry, so they went widescreen.

Then there are the customers who crave the high-end offerings (connoisseurs, elitists, people so rich the price difference for even a marginal improvement is trivial, etc.) to form a market.

I see Amazon has 8K TVs; a glance suggests they tend to be in the 65" to 85" size range.

And of course, people want matching content to take advantage of their big purchase. Content providers then get a chance to shine (and maybe upcharge).

Over time, the cost difference to manufacture a higher-res TV is marginal enough that they go more mainstream, and customers are willing to pay a little more even if they might not notice much difference (e.g.: some modest-size TVs bought at 4K instead of 1080p).

I don't think there's much demand for 5k or 6k video content.
And maybe there never will be. Perhaps 4K is where things top out many years into the future. None of my points above prove a perpetual trend going forward.
 
I get your reasoning. I recall when standard DVD resolution was impressive, then 1080 interlaced was a big, big deal, then 1080p TVs with matching content if you really cared about picture quality, and so on... and now we have 4K.

Companies want to stand out as innovators offering improvements. Not unlike way back when movie theaters wanted something to differentiate themselves from the then fairly new home TV industry, so they went widescreen.

Then there are the customers who crave the high-end offerings (connoisseurs, elitists, people so rich the price difference for even a marginal improvement is trivial, etc.) to form a market.

I see Amazon has 8K TVs; a glance suggests they tend to be in the 65" to 85" size range.

And of course, people want matching content to take advantage of their big purchase. Content providers then get a chance to shine (and maybe upcharge).

Over time, the cost difference to manufacture a higher-res TV is marginal enough that they go more mainstream, and customers are willing to pay a little more even if they might not notice much difference (e.g.: some modest-size TVs bought at 4K instead of 1080p).


And maybe there never will be. Perhaps 4K is where things top out many years into the future. None of my points above prove a perpetual trend going forward.
The problem is that the download sizes of 8k media for streaming would be absolutely prohibitive. It would be like 80 GB for a single movie. So while the TV companies want it, it ain't happening unless our ISPs increase bandwidth accordingly.
 
  • Like
Reactions: drrich2
The problem is that the download sizes of 8k media for streaming would be absolutely prohibitive. It would be like 80 GB for a single movie. So while the TV companies want it, it ain't happening unless our ISPs increase bandwidth accordingly.
You can already stream 8K now. Make sure you change the res to 8K.

Runs fine on my not very impressive 100Mbps connection.
 
  • Like
Reactions: drrich2
Speaking of "is 4K ok?"

Honestly, the monitor I would buy if I needed one right now would be any of the 4k 120hz OLED 32" options

They are PHENOMENAL

Scaled...schmaled....
I'll take 4k high refresh rate OLED all day, any day...

Speaking of which... $899... holy cow


Or $699! That's 165hz, 4K & QD-OLED -- INSANE

I like it. Definitely, a massive upgrade from what I was using.

[Image: MSI_MAG-321UP_Mac.jpg]

From a few inches away:

[Image: MSI_MAG-321UP_closeup_macOS-15_HiDPI.jpg]
 
You can already stream 8K now. Make sure you change the res to 8K.

Runs fine on my not very impressive 100Mbps connection.
Of course it does. It isn’t speed that’s the issue in most cases, it’s raw data passing over the wire. We would have to all pay for those increases. And it won’t be cheap. I mean, 20 hours of streaming would be 1.6 Terabytes. Twenty. Hours. of. Streaming.
 
  • Like
Reactions: thoroc
Of course it does. It isn’t speed that’s the issue in most cases, it’s raw data passing over the wire. We would have to all pay for those increases. And it won’t be cheap. I mean, 20 hours of streaming would be 1.6 Terabytes. Twenty. Hours. of. Streaming.
8K streaming doesn't use that much bandwidth...I wouldn't be able to stream 8K content at all if that was the case.

8K streaming needs like 50Mbps; that's a little over 20 GB an hour.
 
8K streaming needs like 50Mbps
Yeah, 4x the bandwidth needed for 4k.

The point is not that "nobody wants 8k" - it's that there are a lot of chicken-and-egg situations to resolve before 8k becomes "the new 4k" in terms of mass uptake.

My FTTC home broadband is currently giving 40Mbps. I'm lucky in that I do have the option of upgrading to full fibre - many people don't. Bear in mind that you're going to need a bit of headroom over 50Mbps to reliably stream 8k and families will need to support multiple users over that one connection.

I have a fairly large 55" 4k TV and even that size is borderline when it comes to actually doing justice to 8k. Anyhow, half of the material on streaming services isn't even 4k yet, let alone 8k (and I don't just mean old TV shows which don't exist at high def). Not a great incentive to buy a new, expensive TV and increase my monthly broadband charge.

(Edit: bps brain misfire deleted)
 
Last edited:
  • Like
Reactions: thoroc and drrich2
Yeah, 4x the bandwidth needed for 4k.

The point is not that "nobody wants 8k" - it's that there are a lot of chicken-and-egg situations to resolve before 8k becomes "the new 4k" in terms of mass uptake.

My FTTC home broadband is currently giving 40Mbps. I'm lucky in that I do have the option of upgrading to full fibre - many people don't. Bear in mind that you're going to need a bit of headroom over 50Mbps to reliably stream 8k and families will need to support multiple users over that one connection.

I have a fairly large 55" 4k TV and even that size is borderline when it comes to actually doing justice to 8k. Anyhow, half of the material on streaming services isn't even 4k yet, let alone 8k (and I don't just mean old TV shows which don't exist at high def). Not a great incentive to buy a new, expensive TV and increase my monthly broadband charge.


50 Mbps (M bits) x8 = 400 MBps (bytes)
x60 = 24,000 (MB per minute)
x60 = 1,440,000 (MB per hour)
= 1440 GB per hour.
Um no. You got that backwards. It's 8 bits in a byte. You divide by 8 in the first step.
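For the record, here is the corrected arithmetic, dividing by 8 to go from bits to bytes; it lands right around the ~20 GB/hour figure mentioned earlier:

```python
# Convert a streaming bitrate to data consumed per hour.
mbps = 50                     # assumed 8K streaming bitrate from the thread
mbytes_per_sec = mbps / 8     # 8 bits per byte -> 6.25 MB/s
gb_per_hour = mbytes_per_sec * 3600 / 1000
print(f"{mbps} Mbit/s ~= {gb_per_hour:.1f} GB per hour")   # ~22.5 GB/hour
print(f"20 hours ~= {gb_per_hour * 20:.0f} GB")            # ~450 GB
```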
 