
theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Anyone know why Apple chose 254 ppi, specifically, for the displays on its M-series MacBook Pros? It's a curious choice. It doesn't correspond to the ~220 ppi Apple has been using for displays that aren't hand-held, and it doesn't result in a standard resolution (e.g., 4k).

The only rationale I can think of is this: If Apple planned their revision of the XDR to be 7K@32", that would also have a ~254 ppi density (218 x 7/6 ≈ 254). This may be a coincidence, but it makes me wonder if Apple chose 254 ppi for the MBP's because they planned to introduce a 7K@32" display, and wanted to maintain a consistent UI size among them (for a given scaling ratio and OS, the UI size is determined by the pixel density).
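
For reference, here's the arithmetic in a few lines of Python (the 14"/16" panel specs are Apple's published numbers; the 7K@32" figure is, again, just my speculation, not an announced product):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from a panel's pixel dimensions and diagonal size."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(3024, 1964, 14.2), 1))  # 14" MBP  -> ~254 ppi
print(round(ppi(3456, 2234, 16.2), 1))  # 16" MBP  -> ~254 ppi
print(round(218 * 7 / 6, 1))            # hypothetical 7K@32" -> ~254 ppi
```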

I do like the 254 ppi on the MBP's; I find it gives text an extra edge of sharpness over the standard 220 ppi Retina display on my iMac. Yes, Steve Jobs claimed that, because of the angular resolution limit of the human eye, nothing beyond ~300 ppi at 10"–12" would provide increased sharpness (from which all the various Retina display resolutions flowed). However, other display experts have said that the effective limit of the human eye is significantly higher, indicating there is a benefit in going beyond the standard 220 ppi resolution of current Retina displays, even if they are viewed at 20"+.

See:

https://sid.onlinelibrary.wiley.com/doi/full/10.1002/jsid.186
https://mostly-tech.com/tag/steve-jobs/
https://www.cultofmac.com/173702/why-retina-isnt-enough-feature/
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
The more boring answer is that Apple’s been using a scaled resolution on their MBPs by default for years now. Using a 110 pts/inch display (@2x is 220px/inch) but rendering at 127 pts/inch. The new displays are also 127 pts/inch (@2x is 254px/inch), making the default screen resolution native instead of scaled.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
The more boring answer is that Apple’s been using a scaled resolution on their MBPs by default for years now. Using a 110 pts/inch display (@2x is 220px/inch) but rendering at 127 pts/inch. The new displays are also 127 pts/inch (@2x is 254px/inch), making the default screen resolution native instead of scaled.
Sorry, I'm not following your explanation.

I thought Apple rendered each display at some multiple of its native resolution, which is why it recommends using that, and warns people against non-integer ratios (what it calls, incorrectly, a scaled resolution, since integer ratios also represent scaling: "Using a scaled resolution may affect performance"). And the visuals are consistent with this—the ones that aren't integer multiples of native are less crisp.

But it seems what you're saying is that none of the rendering is at an integer multiple of the display's native resolution—it's always at 127 ppi, so it's always non-integer scaled (unless you've got a new 14"/16" MBP). If so, why the improvement when selecting an integer multiple of the display's native res, since you're not at integer scaling anyways?
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Sorry, I'm not following your explanation.

I thought Apple rendered each display at some multiple of its native resolution, which is why it recommends using that, and warns people against non-integer ratios (what it calls, incorrectly, a scaled resolution, since integer ratios also represent scaling: "Using a scaled resolution may affect performance"). And the visuals are consistent with this—the ones that aren't integer multiples of native are less crisp.

But it seems what you're saying is that none of the rendering is at an integer multiple of the display's native resolution—it's always at 127 ppi, so it's always non-integer scaled (unless you've got a new 14"/16" MBP). If so, why the improvement when selecting an integer multiple of the display's native res, since you're not at integer scaling anyways?

I was trying to clarify with the math as well. Be aware that pt != px here. UI layout is done in "points" on macOS and iOS, and then finalized as pixels when it's time to draw, and so when talking about usable screen real estate, points per inch is a useful metric. Apple, because they standardized on 110px/inch with the Mac pre-Retina, 110pt/inch is what the native size of displays also have been, which maps to 220px/inch (2x scaling).

What I'm saying is that previous MBPs were not using integer scaling. At least starting with the 2016 model. The new displays were adjusted to match what the default had been for around 5 years.

The math is that the defaults on the 2016 and later models were laying out at ~127 pt/inch; double that to get pixels and you get 254 px/inch.
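
If you want to check the numbers, here's a rough back-of-the-envelope in Python (the panel sizes and "looks like" defaults below are the usual published figures, so treat the results as approximate):

```python
from math import hypot

def points_per_inch(looks_like_w: int, looks_like_h: int, diagonal_in: float) -> float:
    """Effective UI density: logical points per inch for a given 'looks like' resolution."""
    return hypot(looks_like_w, looks_like_h) / diagonal_in

print(round(points_per_inch(1440, 900, 15.4), 1))   # pre-2016 15" default  -> ~110 pt/in
print(round(points_per_inch(1680, 1050, 15.4), 1))  # 2016-2019 15" default -> ~129 pt/in
print(round(points_per_inch(1728, 1117, 16.2), 1))  # 16" M1 Pro default    -> ~127 pt/in
# double ~127 pt/in for @2x rendering and you land on the ~254 px/in panel
```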
 

tornado99

macrumors 6502
Jul 28, 2013
454
445
I find it gives text an extra edge of sharpness over the standard 220 ppi Retina display on my iMac.

I'm not really buying this. If you happen to be a teenager, or by some coincidence have far sharper than average vision, then possibly. Otherwise I think you're saying that to justify a purchase.

Most people can't tell the difference between 320kbps mp3 and uncompressed audio. If there was also a way of blind-testing displays I expect around 170 ppi would be the threshold for a desktop monitor.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
I was trying to clarify with the math as well. Be aware that pt != px here. UI layout is done in "points" on macOS and iOS, and then finalized as pixels when it's time to draw, and so when talking about usable screen real estate, points per inch is a useful metric. Apple, because they standardized on 110px/inch with the Mac pre-Retina, 110pt/inch is what the native size of displays also have been, which maps to 220px/inch (2x scaling).

What I'm saying is that previous MBPs were not using integer scaling. At least starting with the 2016 model. The new displays were adjusted to match what the default had been for around 5 years.

The math is that the defaults on the 2016 and later models were laying out at ~127 pt/inch; double that to get pixels and you get 254 px/inch.
I'm afraid I'm still not following this. For instance, where you write:

"...because they standardized on 110px/inch with the Mac pre-Retina, 110pt/inch is what the native size of displays also have been, which maps to 220px/inch (2x scaling)."

I would have expected:

"because they standardized on 110pt/inch with the Mac pre-Retina, 110px/inch is what the native size of displays also have been, which maps to 220px/inch (2x scaling)."

I tried googling "Apple rendering 127 ppi" and couldn't find anything substantive. Could you possibly refer me to a well-written technical article that explains this in detail?
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Sorry, I'm not following your explanation.

I thought Apple rendered each display at some multiple of its native resolution, which is why it recommends using that, and warns people against non-integer ratios (what it calls, incorrectly, a scaled resolution, since integer ratios also represent scaling: "Using a scaled resolution may affect performance"). And the visuals are consistent with this—the ones that aren't integer multiples of native are less crisp.

They render at integer scaling, but that rendered image can then be downsampled using fractional scaling. Example: you choose the “highest” scaled option with the old 15” (looks like 1920 x 1200) - the OS renders at 3840x2400 and then downsamples to the native resolution of 2880 x 1800. This way they can simulate fractional scaling while keeping rendering algorithms simple and quality high.
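
A rough sketch of that pipeline in Python, just for illustration (the resolutions are the ones from the example above; the actual compositor is of course more involved):

```python
def hidpi_pipeline(looks_like, native):
    """Render at 2x the 'looks like' size, then resample that buffer to the panel."""
    backing = (2 * looks_like[0], 2 * looks_like[1])   # everything is drawn @2x
    scale = native[0] / backing[0]                     # presentation-stage resample factor
    return backing, scale

# Old 15" (2880x1800 panel), "looks like 1920x1200":
print(hidpi_pipeline((1920, 1200), (2880, 1800)))  # ((3840, 2400), 0.75) -> downsampled
# New 16" (3456x2234 panel) at its default "looks like 1728x1117":
print(hidpi_pipeline((1728, 1117), (3456, 2234)))  # ((3456, 2234), 1.0)  -> shown 1:1
```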

But it seems what you're saying is that none of the rendering is at an integer multiple of the display's native resolution—it's always at 127 ppi, so it's always non-integer scaled (unless you've got a new 14"/16" MBP). If so, why the improvement when selecting an integer multiple of the display's native res, since you're not at integer scaling anyways?

I think the answer to that is algorithmic complexity. Rendering at super-resolution and downsampling will always give you the same or better quality than rendering with scaling, and integer scaling is easier to handle algorithmically. It’s a simpler model for software and UI designers to deal with. That’s why Apple was able to move things to HiDPI very quickly while Microsoft took ages.

The best solution would of course be to drop pixels altogether and work with physical dimensions. The display technology is there, but unfortunately there's too much software legacy to overcome.
 
Reactions: Tagbert

Falhófnir

macrumors 603
Aug 19, 2017
6,146
7,001
They didn't choose it as such; that's just what it worked out as when they restored integer @2x scaling. Because they changed the 'looks like' default resolution from 1440x900 to 1680x1050 (15" MBP) in 2016, that changed the UI density from ~110 to ~128. The new physical resolution is just 4x the UI resolution at twice the density.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
I'm not really buying this. If you happen to be a teenager, or by some coincidence have far sharper than average vision, then possibly. Otherwise I think you're saying that to justify a purchase.

Most people can't tell the difference between 320kbps mp3 and uncompressed audio. If there was also a way of blind-testing displays I expect around 170 ppi would be the threshold for a desktop monitor.
You know, everyone else who's chimed in thus far has tried to be civil, collegial, and helpful. Then you come lumbering in with statements that are by turns obnoxious, ignorant, and lazy. But hey, the internet, right?

Let's take two in particular:

1) "I'm not buying it....I think you're saying that to justify a purchase." Well that's definitely obnoxious. The fact of the matter is that I'm not actively considering purchase of an MBP and, even if I were, I wouldn't buy it for the display. That's because when my main computer was an MBP it spent 95% of its time on a desk, with me viewing it mostly through larger external monitors. My main interest is in upgrading those. Thus we can add that your statement is ignorant as well.

2) "If there was also a way of blind-testing displays I expect around 170 ppi would be the threshold for a desktop monitor." OK, that's just lazy. Why? Because I linked a double-blind study addressing exactly that, which you didn't bother to read before pontificating. Quoting from it:

"This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI." [1]

300 mm is 59% of 20", so those correspond to being able to distinguish 200 ppi vs. 300 ppi, and 300 ppi vs 600 ppi, at 20", respectively (same angular resolutions). And for those like me who often lean in closer, to a 15" viewing distance, it's 270 ppi vs. 400 ppi, and 400 ppi vs. 800 ppi.
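
If anyone wants to check that conversion, it's just the small-angle scaling of ppi with viewing distance (a quick sketch, using the thresholds from the study):

```python
def equivalent_ppi(ppi_ref: float, dist_ref_mm: float, dist_mm: float) -> float:
    """PPI at dist_mm that subtends the same angle per pixel as ppi_ref at dist_ref_mm."""
    return ppi_ref * dist_ref_mm / dist_mm

INCH_MM = 25.4
for p in (339, 508, 1016):                               # study thresholds at 300 mm
    print(round(equivalent_ppi(p, 300, 20 * INCH_MM)),   # at a 20" viewing distance
          round(equivalent_ppi(p, 300, 15 * INCH_MM)))   # at a 15" viewing distance
# -> roughly 200/267, 300/400 and 600/800, i.e. the figures above
```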

Of course, this is just one study. But there are others that support >220 ppi as well. I'd put more stock in what they say, plus my own experience, than some rando on the internet who throws around claims about things he just doesn't understand.

[1] Spencer, Lee, et al. "Minimum required angular resolution of smartphone displays for the human visual system." Journal of the Society for Information Display 21.8 (2013): 352–360.
 
Last edited:

karen999

macrumors member
Sep 12, 2012
59
86
Not really a fan of the increased PPI in the new MacBooks; I’d rather have the old 220 PPI, which doesn't make the UI elements too small.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Not really a fan of the increased PPI in the new MacBooks; I’d rather have the old 220 PPI, which doesn't make the UI elements too small.
Yeah, I can understand that. Do the new MBP's have a 3x integer scaling option and, if so, does that make the UI elements too large for your taste?
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
They render at integer scaling, but that rendered image can then be downsampled using fractional scaling. Example: you choose the “highest” scaled option with the old 15” (looks like 1920 x 1200) - the OS renders at 3840x2400 and then downsamples to the native resolution of 2880 x 1800. This way they can simulate fractional scaling while keeping rendering algorithms simple and quality high.
Sorry, I'm still missing something.

To start, what's the general statement of how this works? Is it that if the user-selected ("looks like") resolution is a x b, the OS will render it at 2a x 2b, and then downsample to the next lower integer fraction of the monitor's native resolution? E.g., using your example, how would that work if the user selected the default "looks like" of 1440 x 900?

More to the point, how would the process you described be fundamentally different for a screen that's 218 ppi vs. one that is 254 ppi? It seems the process would be the same either way, i.e., that neither of these pixel densities is "privileged" by the OS. Yet @Krevnik, whose post you agreed with (you gave it a like), seems to be saying exactly that:

"Apple’s been using a scaled resolution on their MBPs by default for years now. Using a 110 pts/inch display (@2x is 220px/inch) but rendering at 127 pts/inch."

I.e., Krevnik seems to be saying there is something about 127 pts/inch that is privileged based on how macOS does its rendering, which would in turn explain why Apple went with 2 x 127 ppi = 254 ppi on their new MBP's (which was my original question)—but I'm not seeing this from the explanations you folks have been trying to give me. So clearly there's a big missing piece I'm not getting. That's why I asked Krevnik — and I'll ask you the same thing — if you might be able to refer me to a detailed, well-written technical paper on this that lays it all out.
 
Last edited:

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Maybe let me chime in, though I'm also learning.

What @Krevnik is saying is that macOS treats its canvas as 127 points / inch. This is regardless of what output device it will be sending this canvas to. macOS does all its drawing in this virtual canvas, as everything is a bitmap. When it comes time to output to a display device, it then takes in the device's properties and does the appropriate scaling.

So if I'm understanding it correctly, macOS will convert its internal canvas to a 2X (in the event of HiDPI) internal canvas (i.e. to 254 points / inch) and then convert it again for the target output device; in the case where the output device is 220 pixels / inch, macOS will downsample the 254 points / inch image to 220 points / inch. This is basically what @leman is saying, if I understand it correctly.
 
Reactions: 43n79w and Tagbert

jav6454

macrumors Core
Nov 14, 2007
22,303
6,264
1 Geostationary Tower Plaza
Apple didn't choose an X Pixels Per Inch. They just upped the resolution by twice what the MacBooks originally had. Hence PPI went up. However, to achieve the "retina" effect, the resolution was scaled to emulate the original one but with higher definition.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Maybe let me chime in, though I'm also learning.

What @Krevnik is saying is that macOS treats its canvas as 127 points / inch. This is regardless of what output device it will be sending this canvas to. macOS does all its drawing in this virtual canvas, as everything is a bitmap. When it comes time to output to a display device, it then takes in the device's properties and does the appropriate scaling.

So if I'm understanding it correctly, macOS will convert its internal canvas to a 2X (in the event of HiDPI) internal canvas (i.e. to 254 points / inch) and then convert it again for the target output device; in the case where the output device is 220 pixels / inch, macOS will downsample the 254 points / inch image to 220 points / inch. This is basically what @leman is saying, if I understand it correctly.
I'll think about what you said—thanks for trying to throw me a lifeline—but, in the meantime, why would everything be a bitmap? E.g., when I work in Adobe Illustrator, the graphics are vectorized rather than bitmapped. Maybe you're saying that, at the final stage, everything needs to be converted to a bitmap—that would make sense to me, since it has to be converted to a bitmap so the pixels can be addressed.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Maybe let me chime in, though I'm also learning.

What @Krevnik is saying is that macOS treats its canvas as 127 points / inch. This is regardless of what output device it will be sending this canvas to. macOS does all its drawing in this virtual canvas, as everything is a bitmap. When it comes time to output to a display device, it then takes in the device's properties and does the appropriate scaling.

So if I'm understanding it correctly, macOS will convert its internal canvas to a 2X (in the event of HiDPI) internal canvas (i.e. to 254 points / inch) and then convert it again for the target output device; in the case where the output device is 220 pixels / inch, macOS will downsample the 254 points / inch image to 220 points / inch. This is basically what @leman is saying, if I understand it correctly.
OK, I think I have it: There are TWO STEPS here.

For the first step, the Mac needs to convert its internal bitmap (127 pts/in) to the native resolution of the display. If the display is 254 ppi, then that's ideal, because after doing integer upsampling to 254 ppi, no non-integer downsampling is needed (the latter would, by contrast, be needed with, say, a 218 ppi display). Question: How much effect does this have on quality?

Since I don't know the proper term of art, let's call the output from step one the "external bitmap", which is now at the display's native ppi.

For the second step, the Mac needs to take the external bitmap and apply it to the display. If the user wishes the actual sizing to correspond to some integer fraction of the resolution of the display, that's ideal, because no non-integer scaling is needed.

In summary, there are two potential sources of artifacts:
Type 1: Those incurred in going from the 127 pts/in internal bitmap to a display whose native resolution is not an integer multiple of this.
Type 2: Those incurred in displaying the output of step 1 at a non-integer fraction of the native resolution of the display ("non-integer scaling").

Thus:
254 ppi display used at integer scaling: No Type 1 or Type 2 artifacts
254 ppi display used at non-integer scaling: Type 2 artifacts only.
220 ppi display used at integer scaling: Type 1 artifacts only.
220 ppi display used at non-integer scaling: Type 1 and Type 2 artifacts.

But this still leaves the question—why 127 ppi in the first place? Given that their early LCD monitors were typically ~100-110 ppi, wouldn't one of those resolutions have made more sense (so they could avoid the Type 1 artifacts for such displays)?
 
Last edited:

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
I don’t think any major OS uses vectorised graphics as its internal graphics engine.

When upscaling, if I’m not wrong, anti-aliasing will be done on the upscaled bitmap, and likely some other image-enhancing techniques, producing a higher-quality base to downsample from, so it shouldn’t cause too many artifacts unless, of course, the target resolution causes a loss of fidelity.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Apple didn't choose an X Pixels Per Inch. They just upped the resolution by twice what the MacBooks originally had. Hence PPI went up. However, to achieve the "retina" effect, the resolution was scaled to emulate the original one but with higher definition.
But 254 ppi isn't twice what the MacBooks originally had, i.e., they weren't 127 ppi. The early Macbooks were 13.3"@1280x800 => 113.5 ppi. As for the early MBP's: 15.4" = 110 ppi; 17" = 116.5 ppi. The later 17" was 133 ppi. So still not 127. There may have been a 127 ppi Apple monitor, but I couldn't find it.
 

jav6454

macrumors Core
Nov 14, 2007
22,303
6,264
1 Geostationary Tower Plaza
But 254 ppi isn't twice what the MacBooks originally had, i.e., they weren't 127 ppi. The early Macbooks were 13.3"@1280x800 => 113.5 ppi. As for the early MBP's: 15.4" = 110 ppi; 17" = 116.5 ppi. The later 17" was 133 ppi. So still not 127. There may have been a 127 ppi Apple monitor, but I couldn't find it.
After the redesign it was. Again, Apple didn't pick that number. It just happened to be the one after they doubled the resolution, just like they did on the iPhone 4.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
After the redesign it was. Again, Apple didn't pick that number. It just happened to be the one after they doubled the resolution, just like they did on the iPhone 4.
In your previous post you said "They just upped the resolution by twice what the MacBooks originally had", so I calculated the ppi's for the original MacBooks. But now you're saying that it was a MacBook after a redesign. I'm not sure which redesign you're referring to, since there have been multiple generations, so could you please give me the year, size, and model?
 
Last edited:

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
The first retina MacBooks
Nope. The Retina MacBook was 226 ppi. See: https://en.wikipedia.org/wiki/Retina_display

The only 127 ppi Mac I've been able to locate is a 2016 MacBook Air (non-Retina), found from among this long list of all of Apple's 2016 displays: https://www.theverge.com/tldr/2016/...d-screen-sizes-pixels-density-so-many-choices

I find it hard to believe that, in 2016, during the Retina era, Apple would build the rendering specs for its OS around the display on a lone non-Retina Air.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Not really a fan of the increased PPI in the new MacBooks; I’d rather have the old 220 PPI, which doesn't make the UI elements too small.

You can still select the “everything looks bigger” scaling option, which would give you the same sizes as the older model, with the same visual fidelity.

To start, what's the general statement of how this works? Is it that if the user-selected ("looks like") resolution is a x b, the OS will render it at 2a x 2b, and then downsample to the next lower integer fraction of the monitor's native resolution? E.g., using your example, how would that work if the user selected the default "looks like" of 1440 x 900?

The OS renders everything at 2x2 (so at 4x super-resolution). The rendered image is then resampled to the native resolution (most likely using a linear filter). This technique is also known as super-sampling antialiasing.

In the case of 1440x900 the OS renders at 2880x1800. Then this image is displayed using the native resolution of the display. On an older MBP, no resampling is needed as the resolution matches the image. On the new MBP it is resampled to match the 3456x2234 resolution. No information is lost as we are upsampling onto a very dense raster.

More to the point, how would the process you described be fundamentally different for a screen that's 218 ppi vs. one that is 254 ppi? It seems the process would be the same either way, i.e., that neither of these pixel densities is "privileged" by the OS. Yet @Krevnik, whose post you agreed with (you gave it a like), seems to be saying exactly that:

It’s not different at all. The PPI literally doesn’t matter.
"Apple’s been using a scaled resolution on their MBPs by default for years now. Using a 110 pts/inch display (@2x is 220px/inch) but rendering at 127 pts/inch."

I.e., Krevnik seems to be saying there is something about 127 pts/inch that is privileged based on how macOS does its rendering, which would in turn explain why Apple went with 2 x 127 ppi = 254 ppi on their new MBP's (which was my original question)—but I'm not seeing this from the explanations you folks have been trying to give me. So clearly there's a big missing piece I'm not getting. That's why I asked Krevnik — and I'll ask you the same thing — if you might be able to refer me to a detailed, well-written technical paper on this that lays it all out.

It’s simple, really. Apple used the resolution of 1440x900 for a long while on their large laptop line, and the HiDPI resolution of that is 2880x1800. But at some point, with the world moving towards Full HD on compact laptops, that resolution was dated, so they moved the default to 1680x1050. That involved a slight loss of visual fidelity, though, since the supersampled 1680x1050 didn’t exactly map to the hardware resolution. Now with the M1 models they have increased the PPI to improve the fidelity. That’s it.

For a technical paper you can refer to Apple documentation here: https://developer.apple.com/library...d.html#//apple_ref/doc/uid/TP40012302-CH4-SW1
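
To put numbers on that mismatch, a quick sketch using the resolutions already mentioned in this thread:

```python
from fractions import Fraction

# "looks like" 1680x1050 renders @2x as 3360x2100, which then has to be squeezed
# onto the old 2880x1800 panel at a non-integer ratio; the new panel avoids this.
print(Fraction(2880, 2 * 1680))   # 6/7 -> 7 rendered pixels share 6 hardware pixels
print(Fraction(3456, 2 * 1728))   # 1   -> the M1 Pro panel shows the @2x buffer 1:1
```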


OK, I think I have it: There are TWO STEPS here.

For the first step, the Mac needs to convert its internal bitmap (127 pts/in) to the native resolution of the display. If the display is 254 ppi, then that's ideal, because after doing integer upsampling to 254 ppi, no non-integer downsampling is needed (the latter would, by contrast, be needed with, say, a 218 ppi display). Question: How much effect does this have on quality?

Since I don't know the proper term of art, let's call the output from step one the "external bitmap", which is now at the display's native ppi.

For the second step, the Mac needs to take the external bitmap and apply it to the display. If the user wishes the actual sizing to correspond to some integer fraction of the resolution of the display, that's ideal, because no non-integer scaling is needed.

In summary, there are two potential sources of artifacts:
Type 1: Those incurred in going from the 127 pts/in internal bitmap to a display whose native resolution is not an integer multiple of this.
Type 2: Those incurred in displaying the output of step 1 at a non-integer fraction of the native resolution of the display ("non-integer scaling").

Thus:
254 ppi display used at integer scaling: No Type 1 or Type 2 artifacts
254 ppi display used at non-integer scaling: Type 2 artifacts only.
220 ppi display used at integer scaling: Type 1 artifacts only.
220 ppi display used at non-integer scaling: Type 1 and Type 2 artifacts.

But this still leaves the question—why 127 ppi in the first place? Given that their early LCD monitors were typically ~100-110 ppi, wouldn't one of those resolutions have made more sense (so they could avoid the Type 1 artifacts for such displays)?

I think you are massively overcomplicating all this :)

It’s all very simple, really. First, forget about native resolution; that’s a red herring. The most important thing is that you have three kinds of “pixels” in play:

- logical pixels, called points (the pixels the OS and your UI work with)
- rendered pixels, called the backing pixels (the pixels the system is actually rendering)
- hardware pixels, the “real” pixels, the ones the computer can actually display

In the good old times, points and the backing buffer were the same thing. You set the system to 1440x900 resolution, you got a 1440x900 framebuffer, you rendered to it, you sent that off to the display. If the framebuffer you sent does not match the hardware resolution, well, you will probably get a blurry picture. But the most important thing is that neither the system nor the apps know or care about the hardware resolution; for them there is only one resolution they work with, and that’s 1440x900.

Retina displays change the equation by differentiating between the logical pixels and the backing pixels. You still get 1440x900 logical pixels (points), but each point is now backed by 2x2 pixels in your framebuffer (backing buffer). So you render stuff at supersampled resolution and then send that supersampled framebuffer to the display. And just like before, you have to interpolate so that it fits the hardware resolution. Only now you have many more hardware pixels and can get away with more things without visibly degrading image quality. But again, neither the system nor the app cares about the “native resolution”. They only care about points and pixels. That’s it.

The beauty of this system is that it’s very simple and provides the best possible image quality. Loss of visual fidelity can only occur at the presentation stage - when the backing buffer is scaled to the hardware display - but since you are rendering at a supersampled resolution you always get the best possible rendering. And the software only ever has to deal with the basic SSAA - the fact that a point is backed by 2x2 pixels - which makes everything simpler. Fractional scaling, native resolution - you don’t have to care about these things because they kind of happen implicitly at presentation time.

Contrast it with the method used by other systems, where the backing buffer would match the hardware resolution and you instead need to render things at fractional scaling directly. So instead of a fixed 2x2 backing factor that can be hard-coded, you need to tweak your rendering to match the chosen scaling factor. At first it sounds like a less wasteful system (fewer pixels to process compared to Apple’s super-sample-everything approach), but processing more pixels is extremely cheap on modern hardware, while fractional rendering can be algorithmically challenging and all that burden is now on the developer. Apple’s approach trades off some extra (negligible) work to give you the best possible image quality while making developers’ lives as simple as possible.
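
If it helps to see those three layers as data, here’s a toy model in Python (the names are mine, not Apple API, and the real compositor does the resampling on the GPU):

```python
from dataclasses import dataclass

@dataclass
class DisplayMode:
    points: tuple       # logical pixels the UI lays out in ("looks like" size)
    backing_scale: int  # 1 on non-Retina, 2 on Retina
    hardware: tuple     # physical pixels on the panel

    def backing_pixels(self):
        return (self.points[0] * self.backing_scale, self.points[1] * self.backing_scale)

    def presentation_scale(self):
        # factor applied when the backing buffer is pushed to the panel
        return self.hardware[0] / self.backing_pixels()[0]

old_15 = DisplayMode((1680, 1050), 2, (2880, 1800))   # 2016-2019 default
new_16 = DisplayMode((1728, 1117), 2, (3456, 2234))   # M1 Pro default
print(old_15.backing_pixels(), round(old_15.presentation_scale(), 3))  # (3360, 2100) 0.857
print(new_16.backing_pixels(), new_16.presentation_scale())            # (3456, 2234) 1.0
```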
 