This is true for things like most games, or content that isn't rendered at Retina resolution, like low-res pixel-based graphics. Other than that, no scaling in that sense takes place, since things are rendered at the screen's native resolution.
If you play a lot of games this is definitely something to consider, but otherwise I hardly encounter such content anymore.

I'm not sure that's all true. My wife's rMBP looks less critically sharp not running at native/retina native resolution. It's really obvious to my eyes.
 
I'm not sure that's all true. My wife's rMBP looks less critically sharp not running at native/retina native resolution. It's really obvious to my eyes.

If content weren't rendered at the native resolution, it would defeat the main purpose of Retina screens.
If you are talking about things like Finder windows, this shouldn't happen (unless you somehow managed to install an OS like Snow Leopard or older). You could take a photo and check whether straight lines actually have blurred edges.
 
Practically speaking, I don't see (so to speak) a 16K display as a likelihood, and 8K is going to be a long way off. The market is not likely to support it, since the vast majority of users won't be able to distinguish between 4K and 8K, let alone 12K or 16K. A 16K display would have over 140 million pixels and be somewhere in the range of 720 ppi. That's a lot of pixels for a manufacturer to get "right" (how many dead/bright pixels would be acceptable?), a lot of graphics chip capability... and streaming media content in native 8K-16K is going to demand a whole lot more internet bandwidth.
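For a sanity check on those figures, the arithmetic is easy to sketch. This assumes a 16:9 "16K" panel (15360x8640) at a 27-inch diagonal; the exact megapixel and ppi numbers shift with whatever aspect ratio and panel size you assume.

```python
import math

def display_stats(h_px, v_px, diagonal_in):
    """Total pixels and pixels-per-inch for a display of the given resolution and diagonal."""
    total = h_px * v_px
    ppi = math.hypot(h_px, v_px) / diagonal_in
    return total, ppi

# A 16:9 "16K" panel (15360x8640) at 27 inches -- assumed geometry
total, ppi = display_stats(15360, 8640, 27)
print(f"{total / 1e6:.0f} MP at {ppi:.0f} ppi")  # 133 MP at 653 ppi
```

(The post's ~140 MP / ~720 ppi figures imply a slightly different aspect ratio or a somewhat smaller panel; either way, the order of magnitude holds.)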

It's all an illusion anyway. Nearly all media is consumed via lossy data-reduction schemes like JPEG, MP3, and MP4, and few consumers are any the wiser. Do we really need ultra-high-resolution systems at the end of a "good enough" pipeline?

It's not about what someone can discern when cranking the volume to 11, or peering through a magnifying glass. It never has been. It's always been about the minimum thresholds of detection by an average person under typical conditions (24-30 frames per second for smooth motion, 1 decibel as the minimum detectable difference in audio volume, etc.).

Sure, Moore's Law will keep chugging along. At some point we'll transition to 128- or 256-bit microprocessors, too. 3TB primary storage will seem quaint...

No, display resolution is going to plateau, like the 300 dpi offset printing plate, the 24-36 MP DSLR sensor, and 20 Hz-20 kHz audio response. There are simply better ways to spend money and resources.
 
I was joking when I said send it back to Apple. I honestly think this is a wind-up. To be able to see the pixels on a 27" 4K display, I have to hold my iPhone as close as it will focus (around 2"), then either take a photo and zoom in, or zoom in with the camera, to magnify this so it's at least twice the size of the original, possibly more.


Bah! An iPhone! Here's what a macro lens does!
[Attached image: DSC_6443.jpg]


(cropped). Clearly, this is unacceptable. You can see every pixel!
 
Last year, I was experiencing some neck pain and eye strain. Went in for an eye appointment, which I haven't done in forever b/c my vision is great. Turns out I am farsighted, despite seeing well far and near. While I can see in the medium-near distance, my eyes strain more to do it. I never noticed this, but it was causing me to move closer to the screen when reading text (sounds counter-intuitive). This poor posture (curvature of the spine, sharp angle of the neck, etc) from leaning in and being closer to the monitor, is what caused the neck pain. So I got some "monitor glasses", which are basically just light prescription reading glasses. Now I sit at a normal distance and don't feel an urge to "move closer to the screen" when reading.

As others have bluntly said, you're doing it wrong... especially for a screen that large. Be it monitors or books, you shouldn't have to put your nose in it. Perhaps you need to see an eye doc like I did?
 
Please tell me this thread is a joke. Even from a few inches away, the screen is still amazingly sharp.

Sure, wait for 8K. Don't be surprised when it takes 5 years to come out. 5K is here to stay for at least 5 years.
 
Absolutely you can do that. And I have. And I hated it. It looks terrible to ME. It's excellent scaling and mapping, but it's still scaling, and the drop in clarity is blindingly obvious. My wife doesn't mind running her 13" rMBP this way, but I do.

Have you used one of the new MacBooks? It comes by default at a non-native resolution, and everyone I've heard from says they can't tell a difference between native and scaled. Just curious if you've used one and can tell a difference.

If content weren't rendered at the native resolution, it would defeat the main purpose of Retina screens.
If you are talking about things like Finder windows, this shouldn't happen (unless you somehow managed to install an OS like Snow Leopard or older). You could take a photo and check whether straight lines actually have blurred edges.

Can you do some comparison photos of straight lines with native scaling and non-native scaling on a retina display? I don't have a retina mac and am curious about the results.
 
Can you do some comparison photos of straight lines with native scaling and non-native scaling on a retina display? I don't have a retina mac and am curious about the results.

I can do it next weekend. Now I just have an iPhone 6 camera, not ideal for showing pixels of a retina screen.
 
Have you used one of the new MacBooks? It comes by default at a non-native resolution, and everyone I've heard from says they can't tell a difference between native and scaled. Just curious if you've used one and can tell a difference.

I haven't used one except at a glance.
 
If content weren't rendered at the native resolution, it would defeat the main purpose of Retina screens.
If you are talking about things like Finder windows, this shouldn't happen (unless you somehow managed to install an OS like Snow Leopard or older). You could take a photo and check whether straight lines actually have blurred edges.

They do have slightly blurred edges. I can see it with my own eyes. Running a 13" rMBP at 1680x1050 is obviously not as sharp to me compared to the native "looks like 1280x800" Retina mode.
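For context on why scaled modes look softer: macOS renders a scaled mode into a backing store at 2x the "looks like" size, then downsamples that to the panel's native pixels, so every native pixel is resampled at a non-integer ratio. A minimal sketch of that arithmetic, using the 13" rMBP's panel and mode sizes:

```python
def scaled_mode(panel, looks_like):
    """macOS-style HiDPI scaling: draw into a backing store at 2x the
    'looks like' size, then downsample to the panel's native pixels."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    ratio = backing[0] / panel[0]  # backing-store pixels per native panel pixel
    return backing, ratio

# 13" rMBP panel (2560x1600) running the "looks like 1680x1050" mode
backing, ratio = scaled_mode((2560, 1600), (1680, 1050))
print(backing, round(ratio, 4))  # (3360, 2100) 1.3125
```

That non-integer 1.3125:1 resample is why a scaled mode can never be quite as crisp as the native 2:1 mode, even though the downscale is done well.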
 
Bah! An iPhone! Here's what a macro lens does! [View attachment 603381]

(cropped). Clearly, this is unacceptable. You can see every pixel!

Edit.

Sorry, I've just realised that you are being sarcastic - too early for me to pick up on that. ;o)

At least I hope you are being sarcastic - you do know your eyes can't zoom in like that, and normal people simply can't see the pixels on a 4K or 5K screen?
 
Apple's Retina conceit is based on the idea that users view their devices at different distances: very close for a cell phone, somewhat further away for a tablet or laptop, and further still for a 27-inch iMac.

In other words, if you can't see the entire desktop from where you're sitting, you're probably too close to the screen. You shouldn't have to move your head while operating the computer.

There's still antialiasing on a Retina screen; it's just very difficult for the average person to see. To give a sense of scale, the "LETTER FROM EUROPE" text in the photo is around 3 mm high (about 1/8 inch).
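That distance-dependent Retina idea can be put into rough numbers. A common model (an assumption here, not Apple's published spec) is 20/20 visual acuity, about one arcminute per pixel; under it, the minimum viewing distance at which pixels blend together works out to:

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # 20/20 acuity model: ~1 arcminute per pixel (assumption)

def retina_distance_in(ppi):
    """Viewing distance (inches) beyond which a single pixel subtends
    less than one arcminute, i.e. pixels blend for 20/20 vision."""
    return (1 / ppi) / math.tan(ONE_ARCMIN)

print(round(retina_distance_in(218), 1))  # 5K at 27": ~15.8 in
print(round(retina_distance_in(326), 1))  # 8K at 27": ~10.5 in
```

By this model the 218 ppi 5K iMac is "Retina" from roughly 16 inches out, which is why leaning in to elbows-on-desk distance can reveal pixel structure, while something around 8K at 27 inches (~326 ppi) would hold up at about 10-11 inches.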
 
You might have a bad iMac, or your vision is impaired, causing you to see pixels.
 
Is the OP some sort of scanner-visioned alien? There's no way you can see the pixels at a usable distance. You definitely should sit further away from your iMac.

Are you sure you got the Retina iMac?
 
Thinking about it, I think after a couple of months of usage my eyes have adjusted and the screen is just not as sharp as before.

Still the best screen in the world by a mile, but still, 5K is obsolete already. I think Apple should move to 16K, just leave it there for a decade or so, and not worry about it, especially if they want to release 32" or larger displays.
 
How can you see pixels if you're staring that close? All I see is blurry words. Are you sure the pixels aren't from the programs or a design in them?

If not, you've got better eyesight than the rest of us here, that or a defect.
 
How can you see pixels if you're staring that close? All I see is blurry words. Are you sure the pixels aren't from the programs or a design in them?

If not, you've got better eyesight than the rest of us here, that or a defect.


My eyesight is not great.

I'm not saying I can see every pixel; I'm just talking about how text looks jagged starting from around 12 inches away.

But, as time goes on, the screen is becoming less sharp even from a normal distance. It's like when I first saw a 2K monitor: it looked crystal clear, but now 2K is awful beyond words.

5K is a massive improvement, but more improvement is needed.
 
First of all, I love the 5K screen on the new iMac. Easily the best screen I have ever seen.

However, it's still blocky when reading close up.

I like to put my elbows on the desk when I'm reading emails, browsing the internet, etc., so I'm quite close to the screen, and the pixels are really noticeable.

Will 8K solve this issue, or would a screen have to be 16K for a pixel-free viewing experience?

This article should have come with a trigger warning: clickbait!

I have perfect vision and 5K is superb; I can clearly see the advantages over my old 2K/1440p screen.
I do not think 8K will make the difference, especially since very little content exists at that resolution. What is the point? There is such a dearth of 4K content, let alone 5K or 8K. Even 1080p is still not universal in every context, e.g. short video clips and graphics. I'd certainly say a better GPU and other screen improvements such as OLED, HDR, and TDM would bring far more benefits than an arbitrary jump to 8K.
 
This article should have come with a trigger warning: clickbait!

I have perfect vision and 5K is superb; I can clearly see the advantages over my old 2K/1440p screen.
I do not think 8K will make the difference, especially since very little content exists at that resolution. What is the point? There is such a dearth of 4K content, let alone 5K or 8K. Even 1080p is still not universal in every context, e.g. short video clips and graphics. I'd certainly say a better GPU and other screen improvements such as OLED, HDR, and TDM would bring far more benefits than an arbitrary jump to 8K.


The point is, the most basic of elements, text, is not razor sharp.

Do a test: stare at the text on a 5K screen for a while. It's blocky.
 