
Falhófnir

macrumors 603
Aug 19, 2017
6,146
7,001
In your previous post you said "They just upped the resolution by twice what the MacBooks originally had", so I calculated the ppi values for the original MacBooks. But now you're saying that it was a MacBook after a redesign. I'm not sure which redesign you're referring to, since there have been multiple generations, so could you please give me the year, size, and model?
The change from 110ppi to 127ppi UI scaling happened in 2016, when they changed the default resolution from 1440x900 to 1680x1050 (15" model). They then kept that 127ppi the same when they moved to 16" in 2019, just added resolution as the display got larger, so you got a 'looks like' resolution of 1792x1120, but the physical resolution remained a non-integer multiple of that (3072x1920). With the latest model, again it's roughly 127ppi, this time 1728x1117 due to the slightly different shape and size of the display, but the difference is the physical resolution is finally restored to exactly 4x that - 3456x2234.
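To make that concrete, here's a quick sketch (using the resolutions quoted above, nothing official) showing that only the 2021 panel lands back on an exact 2x multiple:

```swift
// Hardware width / "looks like" width gives the scale factor.
// Numbers are the defaults discussed above; only the 2021 panel is exactly 2x.
let generations = [
    ("15\" 2016", 2880.0, 1680.0),  // ≈ 1.714x (12/7)
    ("16\" 2019", 3072.0, 1792.0),  // ≈ 1.714x again
    ("16\" 2021", 3456.0, 1728.0),  // exactly 2.0x
]
for (name, nativeWidth, looksLikeWidth) in generations {
    print(name, nativeWidth / looksLikeWidth)
}
```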
 
  • Like
Reactions: mopatops

tornado99

macrumors 6502
Jul 28, 2013
454
445
You know, everyone else who's chimed in thus far has tried to be civil, collegial, and helpful. Then you come lumbering in with statements that are by turns obnoxious, ignorant, and lazy. But hey, the internet, right?

Let's take two in particular:

1) "I'm not buying it....I think you're saying that to justify a purchase." Well that's definitely obnoxious. The fact of the matter is that I'm not actively considering purchase of an MBP and, even if I were, I wouldn't buy it for the display. That's because when my main computer was an MBP it spent 95% of its time on a desk, with me viewing it mostly through larger external monitors. My main interest is in upgrading those. Thus we can add that your statement is ignorant as well.

2) "If there was also a way of blind-testing displays I expect around 170 ppi would be the threshold for a desktop monitor." OK, that's just lazy. Why? Because I linked a double-blind study addressing exactly that, which you didn't bother to read before pontificating. Quoting from it:

"This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI." [1]

300 mm is 59% of 20", so those correspond to being able to distinguish 200 ppi vs. 300 ppi, and 300 ppi vs 600 ppi, at 20", respectively (same angular resolutions). And for those like me who often lean in closer, to a 15" viewing distance, it's 270 ppi vs. 400 ppi, and 400 ppi vs. 800 ppi.
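The conversion is just a ratio of viewing distances. A quick sketch of that arithmetic (the function name is mine, not from the study):

```swift
// At equal visual angle, the equivalent ppi scales as (old distance / new distance).
func equivalentPPI(_ ppi: Double, atMM d1: Double, toInches d2: Double) -> Double {
    ppi * d1 / (d2 * 25.4)  // 25.4 mm per inch
}
print(equivalentPPI(339, atMM: 300, toInches: 20)) // ≈ 200 ppi at 20"
print(equivalentPPI(508, atMM: 300, toInches: 20)) // ≈ 300 ppi at 20"
print(equivalentPPI(508, atMM: 300, toInches: 15)) // ≈ 400 ppi at 15"
```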

Of course, this is just one study. But there are others that support >220 ppi as well. I'd put more stock in what they say, plus my own experience, than some rando on the internet who throws around claims about things he just doesn't understand.

[1] Spencer, Lee, et al. "Minimum required angular resolution of smartphone displays for the human visual system." Journal of the Society for Information Display 21.8 (2013): 352-360.

Except for the fact that fonts are not composed of straight lines, but of rasterised vectors. Your claim was specifically that text is sharper at 254 ppi vs 220 ppi. None of those studies back up your claim that a difference of 34 ppi at 20 inches is noticeable.
 
Last edited:
  • Like
Reactions: AlixSPQR

jav6454

macrumors Core
Nov 14, 2007
22,303
6,264
1 Geostationary Tower Plaza
Nope. The Retina MacBook was 226 ppi. See: https://en.wikipedia.org/wiki/Retina_display

The only 127 ppi Mac I've been able to locate is a 2016 MacBook Air (non-Retina), found from among this long list of all of Apple's 2016 displays: https://www.theverge.com/tldr/2016/...d-screen-sizes-pixels-density-so-many-choices

I find it hard to believe that, in 2016, during the Retina era, Apple would build the rendering specs for its OS around the display on a lone non-Retina Air.
Again... Apple didn't choose one PPI; it just ended up that way from doubling the resolution. If Apple had picked a PPI first, they would have ended up with wonky resolutions.
 

w5jck

Suspended
Nov 9, 2013
1,516
1,934
I'm not going to speculate, nor join the arguments herein, but I do find it interesting that Apple has used multiple ppi for MacBooks over the years, including two different ones for the current lineup of MacBooks with M1. And even more bizarre is that the MacBook 14 and 16 M1s use screen ratios that are unique, 16x10.39 and 16x10.34, while the 13 models use the more traditional 16x10 ratio like the older 15 did.

  • MacBook Pro 15 2014: 16x10, 2880x1800 px, 220 ppi
  • MacBook Air M1: 16x10, 2560x1600 px, 227 ppi
  • MacBook Pro 13 M1: 16x10, 2560x1600 px, 227 ppi
  • MacBook Pro 14 M1: 16x10.39, 3024x1964 px, 254 ppi
  • MacBook Pro 16 M1: 16x10.34, 3456x2234 px, 254 ppi
Also, note these stats:
  • The MacBook Pro 15 2014 is 1.5x and 1.5x the resolution of 1920x1200. Makes sense.
  • The MacBook Air and Pro 13 M1s are 1.333x and 1.333x the resolution of 1920x1200. Makes sense.
  • The MacBook Pro 14 M1 is 1.575x and 1.637x the resolution of 1920x1200. Bizarre!
  • The MacBook Pro 16 M1 is 1.80x and 1.86x the resolution of 1920x1200. Bizarre!
It is hard to see any rhyme or reason in all this, but maybe it is there somewhere, or maybe the Apple dev teams are smoking too much of something from the special Apple orchard. :cool:
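For what it's worth, the odd-looking ratios do fall straight out of the pixel counts (a quick sketch of my arithmetic):

```swift
// Recover the "16 x ?" ratios above from the raw pixel counts.
let panels = [
    ("MacBook Pro 14 M1", 3024.0, 1964.0),
    ("MacBook Pro 16 M1", 3456.0, 2234.0),
    ("MacBook Pro 13 M1", 2560.0, 1600.0),
]
for (name, width, height) in panels {
    print(name, "16 x", 16 * height / width) // 10.39..., 10.34..., 10.0
}
```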
 

w5jck

Suspended
Nov 9, 2013
1,516
1,934
And one more thing, good luck trying to figure out what the hell Apple means by wide color gamut! Is it true 10-bit per channel for the MacBook screens? Nope, because that would mean 1 billion colors and they list the screens as merely millions of colors (8-bit per channel = 16.7 million). Other than knowing that they do some razzle-dazzle to extend the 8-bit colors beyond 16.7 million, or at least trick our brains into thinking they did, there is no clue as to how many colors we can expect on these screens. It isn't true 10-bit HDR though, but I guess it looks okay for 8-bit + a few more colors. We do know Display P3 has more colors than sRGB, thus a "wider color gamut", but Apple is mum on specific details. At least we see less color banding than on 8-bit monitors. Apple-speak is as creepy and misleading as lawyer-speak, which it basically is...
 
  • Like
Reactions: kvic

leman

macrumors Core
Oct 14, 2008
19,521
19,677
I'm not going to speculate, nor join the arguments herein, but I do find it interesting that Apple has used multiple ppi for MacBooks over the years, including two different ones for the current lineup of MacBooks with M1.

All of these things are fairly clear when you look at how Apple display technology has developed over the years. It's a natural evolution of how they map UI/text size to the display and then improve the visual fidelity as technology matures.

And even more bizarre is that the MacBook 14 and 16 M1s use screen ratios that are unique, 16x10.39 and 16x10.34, while the 13 models use the more traditional 16x10 ratio like the older 15 did.

Not bizarre at all. That's the extra space added by the display alongside the notch.

And one more thing, good luck trying to figure out what the hell Apple means by wide color gamut! Is it true 10-bit per channel for the MacBook screens? Nope, because that would mean 1 billion colors and they list the screens as merely millions of colors (8-bit per channel = 16.7 million). Other than knowing that they do some razzle-dazzle to extend the 8-bit colors beyond 16.7 million, or at least trick our brains into thinking they did, there is no clue as to how many colors we can expect on these screens. It isn't true 10-bit HDR though, but I guess it looks okay for 8-bit + a few more colors.

Again, no mystery here at all. Wide color gamut for Apple is stuff beyond the usual sRGB. It encompasses both increased range of representable colors (Apple settled for DCI-P3 for various reasons) as well as increased dynamic range (aka HDR). Apple is using a neat system for working with wide color they call EDR: colors are encoded like in sRGB but the color values can go beyond 1.0 to represent intensity that normal sRGB displays cannot show.
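You can see that encoding idea directly in AppKit (a small sketch using the public API, not Apple's internal implementation): convert a pure Display P3 red into the extended sRGB space and the components land outside 0...1.

```swift
import AppKit

// In extended sRGB, component values may leave the 0...1 range to
// represent colors that plain sRGB cannot show.
let p3Red = NSColor(displayP3Red: 1, green: 0, blue: 0, alpha: 1)
if let extended = p3Red.usingColorSpace(.extendedSRGB) {
    // red comes out > 1.0; green and blue come out slightly < 0.0
    print(extended.redComponent, extended.greenComponent, extended.blueComponent)
}
```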

We do know Display P3 has more colors than sRGB, thus a "wider color gamut", but Apple is mum on specific details. At least we see less color banding than on 8-bit monitors. Apple-speak is as creepy and misleading as lawyer-speak, which it basically is...

There are plenty of developer resources, presentations, code samples, etc. provided by Apple on this very topic. No idea why you think their documentation is creepy or misleading.

Some stuff to get you started if you are interested: "Explore HDR rendering with EDR" (WWDC21): https://developer.apple.com/videos/play/wwdc2021
 

Tagbert

macrumors 603
Jun 22, 2011
6,256
7,281
Seattle
I don’t think any major OS uses vectorised graphics as its internal graphics engine.

When upscaling, if I’m not wrong, anti-aliasing (and likely some other image-enhancing techniques) will be done on the upscaled bitmap, producing a higher-quality base to downsample from, so it shouldn’t cause too many artifacts unless, of course, the target resolution causes a loss of fidelity.
Certainly when macOS does the initial render on that 127 pt/in frame buffer, many of the UI components are vector (fonts, icons, some graphic elements). Then that frame buffer is scaled to match the display. Apps use a lot of PDF images for icons and graphics.
 

Tagbert

macrumors 603
Jun 22, 2011
6,256
7,281
Seattle
I'm not going to speculate, nor join the arguments herein, but I do find it interesting that Apple has used multiple ppi for MacBooks over the years, including two different ones for the current lineup of MacBooks with M1. And even more bizarre is that the MacBook 14 and 16 M1s use screen ratios that are unique, 16x10.39 and 16x10.34, while the 13 models use the more traditional 16x10 ratio like the older 15 did.

  • MacBook Pro 15 2014: 16x10, 2880x1800 px, 220 ppi
  • MacBook Air M1: 16x10, 2560x1600 px, 227 ppi
  • MacBook Pro 13 M1: 16x10, 2560x1600 px, 227 ppi
  • MacBook Pro 14 M1: 16x10.39, 3024x1964 px, 254 ppi
  • MacBook Pro 16 M1: 16x10.34, 3456x2234 px, 254 ppi
Also, note these stats:
  • The MacBook Pro 15 2014 is 1.5x and 1.5x the resolution of 1920x1200. Makes sense.
  • The MacBook Air and Pro 13 M1s are 1.333x and 1.333x the resolution of 1920x1200. Makes sense.
  • The MacBook Pro 14 M1 is 1.575x and 1.637x the resolution of 1920x1200. Bizarre!
  • The MacBook Pro 16 M1 is 1.80x and 1.86x the resolution of 1920x1200. Bizarre!
It is hard to see any rhyme or reason in all this, but maybe it is there somewhere, or maybe the Apple dev teams are smoking too much of something from the special Apple orchard. :cool:
I believe that the 16x10.39 ratio is for the overall screen but with the menu in the notch, the safe area below the notch is kept at a full 16x10. That was part of the argument about why the notch was not an intrusion into the display but was a case of moving the menu out of the display into the bezel.
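The arithmetic backs that up (my quick check, not an Apple spec):

```swift
// If the area below the notch is a full 16:10, the leftover rows on both
// new panels work out to the same 74-pixel strip for the menu bar.
for (name, width, height) in [("14\"", 3024, 1964), ("16\"", 3456, 2234)] {
    let safeAreaHeight = width * 10 / 16                  // full 16:10 region
    print(name, safeAreaHeight, height - safeAreaHeight)  // 1890 + 74, 2160 + 74
}
```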
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Certainly when macOS does the initial render on that 127 pt/in frame buffer, many of the UI components are vector (fonts, icons, some graphic elements). Then that frame buffer is scaled to match the display. Apps use a lot of PDF images for icons and graphics.

It's a combination of vector and raster rendering, where vector rendering usually utilises the 3D pipeline. The details can get fairly complex quickly. For example, while fonts are vector data, they are often pre-rendered to bitmaps and then composited as bitmap images, since that's more efficient.
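A minimal sketch of that "rasterise once, composite the bitmap" pattern using Core Text (the font, sizes, and caching scheme here are arbitrary choices of mine, and this is not how the macOS compositor is actually structured):

```swift
import AppKit
import CoreText

// Lay out a line of text from vector glyph outlines.
let text = NSAttributedString(string: "Retina",
                              attributes: [.font: NSFont.systemFont(ofSize: 24)])
let line = CTLineCreateWithAttributedString(text as CFAttributedString)

// Rasterise it once into a bitmap context...
let ctx = CGContext(data: nil, width: 128, height: 40,
                    bitsPerComponent: 8, bytesPerRow: 0,
                    space: CGColorSpaceCreateDeviceRGB(),
                    bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
ctx.textPosition = CGPoint(x: 4, y: 10)
CTLineDraw(line, ctx)

// ...and keep the bitmap around for cheap compositing later.
let cachedGlyphs = ctx.makeImage()
```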
 
  • Like
Reactions: Tagbert

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
And one more thing, good luck trying to figure out what the hell Apple means by wide color gamut! Is it true 10-bit per channel for the MacBook screens? Nope, because that would mean 1 billion colors and they list the screens as merely millions of colors (8-bit per channel = 16.7 million).
They don't list them as millions though? For example, 14" and 16" M1 MacBook Pros are listed on Apple's website as supporting: "Color: 1 billion colors".

Also, 10 bits per channel is not the same thing as wide color gamut, despite what you may have heard (some popular explainers conflate the two). More bits provide finer gradations of the range, but wide color gamut means we're increasing the range itself. A P3 wide gamut display should be capable of generating a fundamentally redder red (by which I mean longer wavelength, red being at the longwave end of the visual spectrum) than a display limited to sRGB. This holds true even when comparing a 10bpc sRGB display to an 8bpc P3 display.
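Here's a toy sketch of why the two are independent: bit depth only sets the step size between the endpoints of each channel, while the gamut defines what those endpoints mean.

```swift
// Quantising the same channel value at 8 vs 10 bits changes granularity,
// not which primary "1.0" refers to; the gamut decides that.
func quantize(_ x: Double, bits: Int) -> Double {
    let levels = Double((1 << bits) - 1)
    return (x * levels).rounded() / levels
}
print(quantize(0.5004, bits: 8))  // ≈ 0.50196 (nearest of 256 levels)
print(quantize(0.5004, bits: 10)) // ≈ 0.50049 (nearest of 1024 levels)
```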

Other than knowing that they do some razzle-dazzle to extend the 8-bit colors beyond 16.7 million, or at least trick our brains into thinking they did, there is no clue as to how many colors we can expect on these screens. It isn't true 10-bit HDR though, but I guess it looks okay for 8-bit + a few more colors. We do know Display P3 has more colors than sRGB, thus a "wider color gamut", but Apple is mum on specific details. At least we see less color banding than on 8-bit monitors. Apple-speak is as creepy and misleading as lawyer-speak, which it basically is...
You know that things like P3 and sRGB are industry standards, right? It's perfectly fine and normal for Apple to reference standards without dumping a complete copy of the standard into their web page.

The only "creepy" thing here is what you're doing, to be honest. You're very confused about several topics, and you're blaming Apple for your confusion even though their communications on this are reasonably clear.
 

tornado99

macrumors 6502
Jul 28, 2013
454
445
They don't list them as millions though? For example, 14" and 16" M1 MacBook Pros are listed on Apple's website as supporting: "Color: 1 billion colors".

Also, 10 bits per channel is not the same thing as wide color gamut, despite what you may have heard (some popular explainers conflate the two). More bits provide finer gradations of the range, but wide color gamut means we're increasing the range itself. A P3 wide gamut display should be capable of generating a fundamentally redder red (by which I mean longer wavelength, red being at the longwave end of the visual spectrum) than a display limited to sRGB. This holds true even when comparing a 10bpc sRGB display to an 8bpc P3 display.


You know that things like P3 and sRGB are industry standards, right? It's perfectly fine and normal for Apple to reference standards without dumping a complete copy of the standard into their web page.

The only "creepy" thing here is what you're doing, to be honest. You're very confused about several topics, and you're blaming Apple for your confusion even though their communications on this are reasonably clear.

You're both right, and wrong. Modern 10-bit displays almost always go hand-in-hand with better phosphors on the LEDs, able to represent a wider gamut of colours (although yes, there are a few 10-bit sRGB displays too).

A lot of the credit here has to go to the panel manufacturers who have figured out the chemistry, not Apple!
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
The more boring answer is that Apple’s been using a scaled resolution on their MBPs by default for years now. Using a 110 pts/inch display (@2x is 220px/inch) but rendering at 127 pts/inch. The new displays are also 127 pts/inch (@2x is 254px/inch), making the default screen resolution native instead of scaled.
It’s simple, really. Apple used the resolution of 1440x900 for a long while on their large laptop line, and the HiDPI resolution of that is 2880x1800. But at some point, with the world moving towards Full HD on compact laptops, that resolution was dated, so they moved the default to 1680x1050. That involved a slight loss of visual fidelity, since the supersampled 1680x1050 didn’t map exactly to the hardware resolution. Now with the M1 model they have increased the PPI to improve the fidelity. That’s it.

The change from 110ppi to 127ppi UI scaling happened in 2016, when they changed the default resolution from 1440x900 to 1680x1050 (15" model). They then kept that 127ppi the same when they moved to 16" in 2019, just added resolution as the display got larger, so you got a 'looks like' resolution of 1792x1120, but the physical resolution remained a non-integer multiple of that (3072x1920). With the latest model, again it's roughly 127ppi, this time 1728x1117 due to the slightly different shape and size of the display, but the difference is the physical resolution is finally restored to exactly 4x that - 3456x2234.

OK, I think I have it now. Let me give it one more shot :):

For most displays, both Apple and non-Apple, Apple's current default is integer scaling. That's because non-integer scaling causes some loss of sharpness. For instance, here's the default scaling for the four displays I currently use (by "UI" I mean what Apple calls "looks like"; and "scaling" effectively means "UI magnification relative to native"):

2014 MBP, 2880 x 1800 native, 1440 x 900 UI => 2x scaling
2019 iMac, 5120 x 2880 native, 2560 x 1440 UI => 2x scaling
Dell 27" 4k, 3840 x 2160 native, 1920 x 1080 UI => 2x scaling
Dell 24" WUXGA, 1920 x 1200 native, 1920 x 1200 UI => 1x scaling

However, exceptionally, for the displays on the 2016 and later large MBP's, Apple decided it wanted to shrink the UI from 2x magnification to ≈1.7x magnification to get more info on the screen. [They may have done this for other laptops in their lineup as well, IDK.] Specifically, using the 15" models as an example, they achieved a 1.7x magnification by switching their default UI from 1440 x 900 to 1680x1050, which works out to 2880/1680 = 12/7 ≈1.71x scaling. This non-integer scaling caused a loss of sharpness, but Apple decided they preferred that trade-off.

The reason they've moved to 254 ppi in the M-series MBPs is that it allows them to get the smaller UI they desire, while being able to employ the preferred integer scaling. Specifically, 2x scaling on 254 ppi will yield the same UI size as what would be obtained with 221/254*2 = 1.74x scaling on the 221 ppi 15" MBP, i.e., only slightly larger than the default 1.71x.

Thus ppi does matter if you have a target UI size, and want to maintain integer scaling.
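Or, as arithmetic (a quick sketch; "UI density" here just means panel ppi divided by the scale factor):

```swift
// Both routes land near the same ~127-129 pt/in UI density, but only the
// 254 ppi panel gets there with a clean integer 2x.
let oldWay = 221.0 / (2880.0 / 1680.0) // ≈ 128.9 pt/in via non-integer ≈1.71x
let newWay = 254.0 / 2.0               // = 127.0 pt/in via exact 2x
print(oldWay, newWay)
```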

Given the above, does this suggest to you that Apple might want to move all its high-end external displays to 254 ppi?

Initially I thought it did. I.e., I boldly (foolishly?) predicted that the 27" mini-LED/ProMotion(?) display rumored for the end of 2022 will be 254 ppi instead of 218 ppi (that would provide some additional product differentiation to help justify higher pricing vs. the 27" Studio Display), and that the rumored 7k XDR replacement will be 32" (254 ppi) instead of 36" (218 ppi).

However, upon further consideration, I'm wondering if it's only for the smaller (laptop) displays that Apple thinks this lower UI magnification is needed. After all, there's not as much need to shrink the UI for the larger displays, since they're much less space-constrained (and would also typically have a longer viewing distance).
 
Last edited:

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Your claim was specifically that text is sharper at 254 ppi vs 220 ppi. None of those studies back up your claim that a difference of 34 ppi at 20 inches is noticeable.
Nice try, but no. Your attempt at a put-down (that I was just 'making stuff up' to justify a purchase) wasn't based on claiming that the Δppi was too small to be perceivable; it was based on claiming that the threshold for perceivable improvement, for most viewers, was below 220 ppi, and likely ≈170 ppi. [Re-read your post.] But after I presented evidence debunking your claim, rather than being an adult and acknowledging you got it wrong, you're trying to hide that by pretending you were claiming something different.

Yes, I am being hard on you, but what do you expect when you start a conversation with a baseless ad hominem? You've poisoned the well here, so there's no point in you continuing on this thread. But maybe try not doing that to the OP of the next thread you join, eh?
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,677
For instance, here's the default scaling for the four displays I currently use (by "UI" I mean "looks like"):

2014 MBP, 2880 x 1800 native, 1440 x 900 UI => 2 x scaling
2019 iMac, 5120 x 2880 native, 2560 x 1440 UI => 2 x scaling
Dell 27" 4k, 3840 x 2160 native, 1920 x 1080 UI => 2 x scaling
Dell 24" WUXGA, 1920 x 1200 native, 1920 x 1200 UI => 1 x scaling

I would say it's kind of the other way around. The 2014 MBP has the hardware resolution of 2880 x 1800 pixels because it's 2x of the 1440x900 that Apple had traditionally used. The point of HiDPI rendering is that the size of elements remains the same, but everything becomes sharper. In other words, 2880 x 1800 is supersampled 1440x900. That's also the reason why Apple uses 1920 x 1080 on non-Apple 4K displays: those display sizes have traditionally been used for full HD content.


For most displays, both Apple and non-Apple, Apple's current default is integer scaling. That's because non-integer scaling causes some loss of sharpness. [...] However, exceptionally, for the displays on the 2016 and later large MBP's, Apple decided it wanted to shrink the UI to get more info. on the screen. [They may have done this for other laptops in their lineup as well, IDK.] Specifically using the 15" models as an example, they wanted the UI to be 1680x1050 instead of 1440 x 900, which meant the scaling was 2880/1680 ≈1.71 instead of 2. This caused a loss of sharpness, but Apple decided they preferred that trade-off.

Apple desired to use a higher standard resolution because that is where the industry was going, and 15" MacBooks had started to look dated with "only 1440x900" (no matter how sharp it looked) when everyone else had been offering Full HD for a while. So they changed the default resolution setting and later (with the M1 machines) upgraded the panel to slightly improve the image quality.


Thus ppi does matter if you have a target UI size, and you want to maintain integer scaling.

The software scaling is always integer. The scaling relative to the hardware panel depends on the resolution you choose. You obviously lose some accuracy if your hardware resolution is lower than the resolution of the backing buffer, but the more important question is whether that accuracy loss is perceivable. For example, I am not able to see any difference in perceived sharpness between my older 15" and the M1 16". Some people might. But at some point it just doesn't matter.
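To illustrate with the 2019 16" numbers (a sketch of the buffer sizes, not of macOS internals):

```swift
// The software render is always an integer 2x of the chosen "looks like"
// resolution; any non-integer step happens only when that backing buffer
// is resampled to the physical panel.
let looksLike = (w: 1792, h: 1120)                     // default on the 2019 16"
let backing = (w: looksLike.w * 2, h: looksLike.h * 2) // 3584 x 2240, exact 2x
let panel = (w: 3072, h: 1920)                         // physical pixels
print(backing, Double(panel.w) / Double(backing.w))    // ≈ 0.857, non-integer
```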

To make an extreme argument, you could display a 100K image on a 10K, 15K, 30K or 50K hardware panel without any human being able to see any difference, even from the closest viewing distance. Our eyes have a limit too, after all. I am sure that there are precise calculations done on this very topic, but off the top of my head I would speculate that for a 16" panel the reasonable limit should be somewhere around 5K-8K. Any higher resolution than that would be pointless to human vision. And we are already damn close to this limit. Which is probably why Apple was not in a rush to increase the panel resolution. In fact, I am surprised that they did it at all: miniLED backlights are a much more dramatic change for image quality than the slight increase in resolution.

Given the above, does this suggest to you that Apple might want to move all its higher-end displays to 254 ppi?

Depends on the resolutions they target. Going higher PPI would only make sense if they want to increase the resolution across the board. Apple likes to keep font sizes relatively similar across devices, but I am not sure how the math works out with the modern generation. On desktop that is around 218ppi, and even the newest Apple displays seem to follow suit. One thing to mention is that desktops are usually viewed from further away than laptops, so a "smaller UI" is probably less common there. Basically, the closer you hold the device, the higher the PPI. Which makes sense.

Following that logic, no, I don't expect Apple's desktop displays to use higher PPI any time soon.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Apple likes to keep font sizes relatively similar across devices, but I am not sure how the math works out with the modern generation.
I don't have a 2016 MBP to check this. But: Wasn't the whole point of the switch in the default "looks like" resolution on the 2016+ large MBP's (from 1440 x 900 to 1680x1050 on the 15") to make everything visually smaller (by a factor of 1440/1680 ≈ 0.86), which changed the default size at which fonts were displayed? I.e., isn't that the essence of it?

You've used the word "dated". But the only change from this switch in default UI was to magnification. So isn't that really what "dated" means here—that the magnification was considered too large?

And, given that, isn't this statement, from the documentation you linked (which was written in 2012), out of date? I.e., with the change in 2016, didn't they set aside size-invariance?:

"The result is that if you draw the same content on two similar devices, and only one of them has a high-resolution screen, the content appears to be about the same size on both devices...Size invariance is a key feature of high resolution."


Come to think of it, they didn't have size invariance even back in 2012, when this article was written, because my 4k 27" (163 ppi) of course has a different default UI size from both the Retina monitors and low-DPI monitors on the market at the time. The size invariance that existed in 2012 was only between Retina monitors and low-DPI monitors, since their ppi's varied by a factor of ≈2. I'm not complaining about this. I'm just stating what the facts appear to be.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,677
I don't have a 2016 MBP to check this. But: Wasn't the whole point of the switch in the default "looks like" resolution on the 2016+ large MBP's (from 1440 x 900 to 1680x1050 on the 15") to make everything visually smaller (by a factor of 1440/1680 ≈ 0.86), which changed the default size at which fonts were displayed? I.e., isn't that the essence of it?

You've used the word "dated". But the only change from this switch in default UI was to magnification. So isn't that really what "dated" means here—that the magnification was considered too large?

And, given that, isn't this statement, from the documentation you linked (which was written in 2012), out of date? I.e., with the change in 2016, didn't they set aside size-invariance?:

"The result is that if you draw the same content on two similar devices, and only one of them has a high-resolution screen, the content appears to be about the same size on both devices...Size invariance is a key feature of high resolution."


What I mean is that Apple used a "default" 1440x900 resolution in their 15" laptop for a long while, until at least 2016. While the retina MBPs were theoretically capable of displaying higher logical resolutions (at better quality than native panels), they still defaulted to 1440x900. By "dated" I mean that other 15" laptops on the market would use full HD or similar resolutions, which made the default UI on the 15" MBP too big for the contemporary taste.

So yes, you are right. While Apple traditionally used the same UI sizes across different device types (~110 ppi for Macs), they recently started to differentiate the resolutions and UI sizes based on the typical viewing distance, so as to optimise the available space.
 
  • Like
Reactions: theorist9

Falhófnir

macrumors 603
Aug 19, 2017
6,146
7,001
OK, I think I have it now. Let me give it one more shot :):

For most displays, both Apple and non-Apple, Apple's current default is integer scaling. That's because non-integer scaling causes some loss of sharpness. For instance, here's the default scaling for the four displays I currently use (by "UI" I mean what Apple calls "looks like"; and "scaling" effectively means "UI magnification relative to native"):

2014 MBP, 2880 x 1800 native, 1440 x 900 UI => 2x scaling
2019 iMac, 5120 x 2880 native, 2560 x 1440 UI => 2x scaling
Dell 27" 4k, 3840 x 2160 native, 1920 x 1080 UI => 2x scaling
Dell 24" WUXGA, 1920 x 1200 native, 1920 x 1200 UI => 1x scaling

However, exceptionally, for the displays on the 2016 and later large MBP's, Apple decided it wanted to shrink the UI from 2x magnification to ≈1.7x magnification to get more info on the screen. [They may have done this for other laptops in their lineup as well, IDK.] Specifically, using the 15" models as an example, they achieved a 1.7x magnification by switching their default UI from 1440 x 900 to 1680x1050, which works out to 2880/1680 = 12/7 ≈1.71x scaling. This non-integer scaling caused a loss of sharpness, but Apple decided they preferred that trade-off.

The reason they've moved to 254 ppi in the M-series MBPs is that it allows them to get the smaller UI they desire, while being able to employ the preferred integer scaling. Specifically, 2x scaling on 254 ppi will yield the same UI size as what would be obtained with 221/254*2 = 1.74x scaling on the 221 ppi 15" MBP, i.e., only slightly larger than the default 1.71x.

Thus ppi does matter if you have a target UI size, and want to maintain integer scaling.

Given the above, does this suggest to you that Apple might want to move all its high-end external displays to 254 ppi?

Initially I thought it did. I.e., I boldly (foolishly?) predicted that the 27" mini-LED/ProMotion(?) display rumored for the end of 2022 will be 254 ppi instead of 218 ppi (that would provide some additional product differentiation to help justify higher pricing vs. the 27" Studio Display), and that the rumored 7k XDR replacement will be 32" (254 ppi) instead of 36" (218 ppi).

However, upon further consideration, I'm wondering if it's only for the smaller (laptop) displays that Apple thinks this lower UI magnification is needed. After all, there's not as much need to shrink the UI for the larger displays, since they're much less space-constrained (and would also typically have a longer viewing distance).
Yep, that's pretty much it: the screen's pixel density is a byproduct of the UI scale they chose, plus their restoring 'perfect' Retina scaling after several years of using slightly mismatched logical and physical resolutions.
 
  • Like
Reactions: theorist9

tornado99

macrumors 6502
Jul 28, 2013
454
445
Nice try, but no. Your attempt at a put-down (that I was just 'making stuff up' to justify a purchase) wasn't based on claiming that the Δppi was too small to be perceivable; it was based on claiming that the threshold for perceivable improvement, for most viewers, was below 220 ppi, and likely ≈170 ppi. [Re-read your post.] But after I presented evidence debunking your claim, rather than being an adult and acknowledging you got it wrong, you're trying to hide that by pretending you were claiming something different.

Yes, I am being hard on you, but what do you expect when you start a conversation with a baseless ad hominem? You've poisoned the well here, so there's no point in you continuing on this thread. But maybe try not doing that to the OP of the next thread you join, eh?

I'm going to set aside the (weird) emotion in your response.

There are two issues here:

a) The limit of human visual perception

b) Whether the average person can tell the difference between 220 ppi and 254 ppi

The studies you cite all refer to a). However they are not looking at fonts on a desktop, they're doing synthetic tests such as a thin dark line on a uniform background. So they don't back up your point.

Secondly, even if the visual limit was 500 ppi, it doesn't follow that the difference between 220 ppi and 254 ppi would be noticeable. For a start, visual acuity is non-linear. The improvement between 50 ppi and 100 ppi is not the same as between 100 ppi and 200 ppi. Where's your evidence for b)?

Lastly, you can test b) yourself. Take two different phones with a small ppi difference - does one look sharper?

I have three 24 inch displays/iMac in my home office: 122 ppi, 185 ppi, and 218 ppi. The 122 ppi is a decent step up from a common 95 ppi display. The 185 ppi is similarly better than the 122 ppi. And the 185 ppi and 218 ppi are indistinguishable. If you look even more carefully into reviews comparing 165 ppi to 185 ppi, users of both do notice a small difference. So that's pretty good evidence for me that the real-life limit is between 165 ppi and 185 ppi.
 
Last edited:

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
You're both right, and wrong. Modern 10-bit displays almost always go hand-in-hand with better phosphers on the LEDs able to represent a wider gamut of colours (although yes there are a few 10-bit sRGB too).
No, I'm just right. I was making the point that 10bpc is not intrinsically linked to wider gamut, because it isn't, as you just admitted yourself. Bits per channel and gamut are independent variables in most display technologies.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
What I mean is that Apple used a "default" 1440x900 resolution in their 15" laptop for a long while, until at least 2016. While the retina MBPs were theoretically capable of displaying higher logical resolutions (at better quality than native panels), they still defaulted to 1440x900. By "dated" I mean that other 15" laptops on the market would use full HD or similar resolutions, which made the default UI on the 15" MBP too big for the contemporary taste.

So yes, you are right. While Apple traditionally used the same UI sizes across different device types (~110 ppi for Macs), they recently started to differentiate the resolutions and UI sizes based on the typical viewing distance, so as to optimise the available space.
The 15" MBP had a 1680x1050 option from 2010 to 2012, and the 17" model moved to a comparable 1920x1200 resolution already in 2007 or 2008. The retina laptops received a lot of criticism back in the day, and the low effective resolution was one of the better reasons.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
OK, I just realized the underlying reason for Apple's specific selection of 127 points per inch (=>254 ppi)—IT'S METRIC!

[Or, equivalently, SI.]

More precisely, these give nice round numbers when expressed metrically: since the adoption of the International Yard about 60 years ago, the inch has been defined as exactly 25.4 mm. Thus 127 points per inch and 254 ppi are exactly 5 points per mm and 10 pixels per mm.

****

Separately:


I just checked the scaling options on the 6016 x 3384 XDR. They're 2x, 2.35x, 3.13x, and 4x. The non-integer options cause a loss of sharpness. I would think there'd be users who would like roughly a 3x UI magnification, but don't want the loss of sharpness of the 3.13x option. Unfortunately, 6016 isn't exactly divisible by 3. But there is a workaround that would enable Apple to offer 3x scaling: for this specific option, render as if the screen's native resolution were 6018 x 3384; then you can get exactly 3x with a "looks like" 2006 x 1128. That won't fit on the screen, but it's not a problem: you'd have a two-column overscan, so you'd just need to discard the leftmost and rightmost columns. Since each column is only 0.12 mm wide, the loss would not be visible to the viewer, and would certainly be worth the gain in sharpness.
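Here's the arithmetic for that workaround (a sketch; macOS offers no such option today):

```swift
// A 6018-wide virtual panel is exactly divisible by 3, giving an integer
// 3x "looks like" 2006 x 1128; the 2 extra columns (~0.12 mm each, one
// per side) would be cropped as overscan.
let panel = (w: 6016, h: 3384)
let virtualWidth = 6018
print(virtualWidth / 3, panel.h / 3)  // 2006 x 1128 "looks like"
print(virtualWidth - panel.w)         // 2 columns of overscan to discard
```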

The same workaround could be applied to the Studio Display, as well as their other Retina displays.

If the Gen II XDR is 7k, it will likely be either the same 218 ppi with a larger screen (36" => larger average viewing distance than the Gen I XDR) or the same 32" screen size with an increased pixel density (254 ppi => smaller default UI than the Gen I XDR). Either way, this increases the benefits of having a 3x scaled UI available.
 
Last edited:
  • Like
Reactions: SpotOnT

tornado99

macrumors 6502
Jul 28, 2013
454
445
OK, I just realized the underlying reason for Apple's specific selection of 127 points per inch (=>254 ppi)—IT'S METRIC!

[Or, equivalently, SI.]

More precisely, these give nice round numbers when expressed metrically: since the adoption of the International Yard about 60 years ago, the inch has been defined as exactly 25.4 mm. Thus 127 points per inch and 254 ppi are exactly 5 points per mm and 10 pixels per mm.

Why would that matter for a computer display?
 
  • Like
Reactions: Krevnik

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Why would that matter for a computer display?
And @Krevnik

It wouldn't matter, and that's the point.

You want to think about this like a hardware engineer or scientist, since they're likely the ones that made the selection. They knew there would be no significant functional difference between setting the default UI density to 126 or 128 ppi (or some other nearby number) rather than 127.

So what would motivate them to choose that specific number? Well, the most obvious explanation is that they were thinking metrically, which means 126, 127, and 128 ppi work out to ≈4.96063 ppmm, 5 ppmm (exactly), and ≈5.03937 ppmm, respectively (ppmm = pixels per mm). So if they're all functionally the same, what do you think the scientist or engineer is going to choose? The answer is 5 ppmm, because working in exact numbers that are multiples of 5 or 10 provides a nice simplification, reduces the chance of error, and is aesthetically more pleasing. It's just silly to spec something out at, say, ≈5.03937 ppmm when exactly 5 ppmm works just as well (assuming you're working in metric units, which I believe their choice of 127 indicates they were).

For an analogous reason, we also like working in natural units (where, e.g., the values of c and ħ are set to unity).
 

tornado99

macrumors 6502
Jul 28, 2013
454
445
The engineering is determined in the panel fabs belonging to companies like BOE, and actually has very little to do with Apple or macOS. It's far more likely that it has something to do with the intricacies of the LCD production process than with Apple specifying they wanted exactly 254 ppi.

Also, the perceived difference in text sharpness between two displays depends on far more than pixel density alone. Factors such as gamma, RGB pixel structure, polariser quality, and lamination and bonding all play a part. Furthermore, LG, BOE, and Samsung have completely different ways of achieving wide-angle "IPS" (a trademark of LG) viewing. So unless your 220 ppi and 254 ppi panels were identical in all other respects, it's not a fair comparison.
 
  • Like
Reactions: iPadified

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
And @Krevnik

It wouldn't matter, and that's the point.

You want to think about this like a hardware engineer or scientist, since they're likely the ones that made the selection. They knew there would be no significant functional difference between setting the default UI density to 126 or 128 ppi (or some other nearby number) rather than 127.

So what would motivate them to choose that specific number?

I don't know why you mentioned me, since I ducked out of the thread after the numerology started up, but since you did, now you get me rambling again.

Apple first launched the Retina MacBook Pros with displays at double the resolution of the 1440x900 displays they had been using. Then they decided to change the default "looks like" resolution to be one step up ("looks like 1680x1050"). After a couple of years of seeing how few users reverted back to "looks like 1440x900", when they had the opportunity to do the redesign for the M1, they sought out panels as close to that as they could, so it would be a native 2x again with a similar amount of screen real estate per inch as the previous MBP model. That's it.

There's no big master plan here, nothing hiding in the numbers.
 