
mszilard

macrumors regular
Original poster
Oct 16, 2012
197
96
Is there a performance impact using the highest resolution (More Space) vs using the default resolution? I'm not talking about external displays, but the performance of an M1 MacBook Pro.
 
  • Like
Reactions: rmadsen3

CWallace

macrumors G5
Aug 17, 2007
12,525
11,542
Seattle, WA
As I understand it, the GPU renders at the default resolution (so 3024x1964) and then scales it to whatever resolution has been selected (Default or Scaled). I suppose scaling at 200% (Default for Display) is easier than the others so there is likely a bit of a performance hit (and the Displays app notes choosing a scaled display can affect performance).
 

joevt

macrumors 604
Jun 21, 2012
6,963
4,257
As I understand it, the GPU renders at the default resolution (so 3024x1964) and then scales it to whatever resolution has been selected (Default or Scaled). I suppose scaling at 200% (Default for Display) is easier than the others so there is likely a bit of a performance hit (and the Displays app notes choosing a scaled display can affect performance).
I think that's backwards. The GPU renders at whatever resolution is selected (so anything up to 6048x3928) and then scales that to the display's resolution (3024x1964).

If you choose the Default for Display mode "Looks like 1512x982" (a HiDPI or Retina mode), the rendering resolution is actually 3024x1964, with text and objects drawn twice as wide and tall as in non-HiDPI/Retina mode, and no scaling needs to occur.

If you choose the "More Space" option ("Look like 3024x1964") then it probably is not a HiDPI/Retina mode. The rendering resolution is 3024x1964 and no scaling is required. If for some reason this mode is a HiDPI/Retina mode (using a 6048x3928 frame buffer) then scaling will occur.

You need an app like SwitchResX to see if a mode is scaled or not. A display timing may have a different resolution than the display - in that case scaling is done by the display. Usually a scaled mode uses the native display timing. If it does not, then scaling may be done by both the GPU and the display.
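
If you don't want to install an extra app, a few lines of Swift against CoreGraphics will also list the modes and show which ones use a bigger backing frame buffer than their point size. A minimal sketch, assuming you run it as a command-line tool (the exact mode list depends on your display):
Code:
import CoreGraphics

let display = CGMainDisplayID()
// Ask for the "looks like" duplicates too, otherwise some HiDPI modes are hidden
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
    for mode in modes {
        // width/height are points ("looks like"); pixelWidth/pixelHeight are the frame buffer
        let tag = mode.pixelWidth > mode.width ? "HiDPI" : "1x"
        print("\(mode.width)x\(mode.height) points -> \(mode.pixelWidth)x\(mode.pixelHeight) pixels (\(tag))")
    }
}
This won't show you the display-timing side of it (whether the panel itself is scaling), only which modes render into a larger buffer than their point size.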
 

mszilard

macrumors regular
Original poster
Oct 16, 2012
197
96
Thank you for the replies.
It seemed to me that with the "More Space" option, scrolling was more jittery than with the default resolution. It didn't affect memory or CPU usage, but scrolling was worse.
 

Significant1

macrumors 68000
Dec 20, 2014
1,686
780
Thank you for the replies.
It seemed to me that with the "More Space" option, scrolling was more jittery than with the default resolution. It didn't affect memory or CPU usage, but scrolling was worse.
It renders the image at 2x the height and 2x the width of the selected resolution and then scales it down to the physical resolution. So a higher resolution is more demanding, but even the default resolution is being scaled.

If you download apps like EasyRes, QuickRes, etc. you can select native resolutions without Retina scaling. But they will either be blurry or render the UI incredibly tiny.
 
  • Like
Reactions: throAU

Pro Apple Silicon

Suspended
Oct 1, 2021
361
426
Is there a performance impact using the highest resolution (More Space) vs using the default resolution? I'm not talking about external displays, but the performance of an M1 MacBook Pro.
FWIW, I am running two (2) LG UltraFine 4K displays @60 connected to my M1 Pro. Both of them at scaled resolution. The tests I've done show there is almost no additional GPU load running these displays vs running just the built in display at default resolution.
 
  • Like
Reactions: mszilard

PeterLC

macrumors member
Jul 26, 2016
53
15
Mid-Canada
FWIW, I am running two (2) LG UltraFine 4K displays @60 connected to my M1 Pro. Both of them at scaled resolution. The tests I've done show there is almost no additional GPU load running these displays vs running just the built in display at default resolution.
By the way, I've been searching for a 24" LG UltraFine 4K display and can't find one that claims to run on a Mac and is not back-ordered. Which model # do you run, please?
 

Tagbert

macrumors 603
Jun 22, 2011
6,254
7,280
Seattle
Is there a performance impact using the highest resolution (More Space) vs using the default resolution? I'm not talking about external displays, but the performance of an M1 MacBook Pro.
It’s not really stressing the GPUs on these things to render the screen no matter what resolution you set it at.
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
If you choose the "More Space" option ("Look like 3024x1964") then it probably is not a HiDPI/Retina mode. The rendering resolution is 3024x1964 and no scaling is required. If for some reason this mode is a HiDPI/Retina mode (using a 6048x3928 frame buffer) then scaling will occur.

A quick comment on this: HiDPI/Retina simply means that a "logical pixel" (what software sees as a pixel) is not the same as the actually rendered pixel. In HiDPI mode on a Mac, a single logical pixel is represented by a 2x2 array of backing pixels. In your example (3024x1964), the resolution software sees is 1512x982, with each pixel rendered with sub-pixel precision using 4 pixels in memory. The resulting 3024x1964 buffer is then downsampled to the native resolution of the display.
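
If you want to see those numbers on your own machine, NSScreen exposes them directly. A small sketch, assuming it runs inside an AppKit app (NSScreen needs a window-server connection):
Code:
import AppKit

if let screen = NSScreen.main {
    let points = screen.frame.size              // the logical resolution, e.g. 1512x982
    let scale  = screen.backingScaleFactor      // 2.0 in a HiDPI/Retina mode
    let pixels = screen.convertRectToBacking(screen.frame).size
    print("logical \(points), scale \(scale), backing \(pixels)")
}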
 

joevt

macrumors 604
Jun 21, 2012
6,963
4,257
A quick comment on this: HiDPI/Retina simply means that a "logical pixel" (what software sees as a pixel) is not the same as the actually rendered pixel. In HiDPI mode on a Mac, a single logical pixel is represented by a 2x2 array of backing pixels. In your example (3024x1964), the resolution software sees is 1512x982, with each pixel rendered with sub-pixel precision using 4 pixels in memory. The resulting 3024x1964 buffer is then downsampled to the native resolution of the display.
1512x982 is wrong.

In my example, I said "Looks like 3024x1964" (reported by macOS to the user). The resolution software sees is 3024x1964 with scale 1 or 3024x1964 with scale 2 (which is drawn to a frame buffer that is 6048x3928).

If you take a screenshot when using this mode, you'll get a png file that is 3024x1964 pixels (if scale 1 is being used) or 6048x3928 (if scale 2 is being used).
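
A quick way to check a screenshot's pixel size without opening it is "sips -g pixelWidth -g pixelHeight screenshot.png" in Terminal, or the same thing in a few lines of ImageIO (the path is just a placeholder):
Code:
import Foundation
import ImageIO

let url = URL(fileURLWithPath: "/Users/me/Desktop/screenshot.png") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] {
    let w = props[kCGImagePropertyPixelWidth as String] ?? "?"
    let h = props[kCGImagePropertyPixelHeight as String] ?? "?"
    print("screenshot is \(w) x \(h) pixels")
}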
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
In my example, I said "Looks like 3024x1964" (reported by macOS to the user). The resolution software sees is 3024x1964 with scale 1 or 3024x1964 with scale 2 (which is drawn to a frame buffer that is 6048x3928).

Ah, yes, sorry, I misunderstood, my bad. But do they even offer a looks like 3024x1964 mode?
 
  • Like
Reactions: Tagbert

Tagbert

macrumors 603
Jun 22, 2011
6,254
7,280
Seattle
Ah, yes, sorry, I misunderstood, my bad. But do they even offer a looks like 3024x1964 mode?
I have an M1 MBA with a 13” screen with a physical resolution of 2560x1600.
The built-in Display tool only allows a max resolution setting of 1680x1050. I usually use 1440x900. Both are HiDPI.
That tool will let you set the resolution of an external 4K display to 3840x2160.

SwitchResX does let you use 2560x1600 for the built-in display but, wow! everything is so tiny!
I assume that SwitchResX will allow you to choose the native resolution of your displays, too.
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
That tool will let you set the resolution of an external 4K display to 3840x2160.

I assume that is not a HiDPI mode then? Need to check when I am in the office, I didn't know they allowed you to select the full resolution now...
 

Tagbert

macrumors 603
Jun 22, 2011
6,254
7,280
Seattle
I assume that is not a HiDPI mode then? Need to check when I am in the office, I didn't know they allowed you to select the full resolution now...
That is not HiDPI, just direct native resolution.

For the MBA, the highest HiDPI res is 1680x1050
For a 4K monitor, the highest HiDPI res is 3008x1692

I would expect the max HiDPI for the new MBPs to be somewhere in between those.
 

x3n0n1c

macrumors regular
Jul 9, 2014
185
28
Short answer, yes there will be a performance impact, though small.

The reason bumping your scaling setting up to "More Space" causes a bit of impact is that you are rendering a higher-than-native-resolution image which is then scaled down to fit your screen.

Taking the 16.2" as as example, it has a Native Resolution of 3456x2234. Because the resolution of this screen is too high to render each pixel 1:1, as things would be super tiny, apple uses a HiDPI mode that scales the UI to match the size of lower resolution screen.

Apple makes this simpler for users by just labeling these modes from Larger Text, to Default, and to More Space at the high end.

What these are doing internally is rendering your screen at double the listed "looks like" resolution in each axis. One scaling option will always be half your native resolution in each axis, and this will be the clearest as it maps the pixels 1:1. Which one matches your native resolution will vary by which laptop you have. In the case of the 16.2", the "Default" scale is 1:1.

If you increase it to "More Space", your system has to render your screen at a higher resolution than your Native Display resolution, increasing the workload on your system. In the 16.2"'s case, the "More Space" option will actually render at 4112x2658. You can confirm this by taking a screenshot and note the screenshot's resoltion. It will match the rendering resolution and not your screen's native (unless you're using your machines 1:1 scaling mode).

A downside is that this also works in the other direction for the lower scaling modes: they render below native resolution and actually look blurry.
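
To put numbers on the 16.2" example, here is a tiny sketch of that arithmetic. The two "looks like" sizes are the ones implied above (Default = half of native, More Space = 2056x1329); they are assumptions taken from the post, not values queried from macOS:
Code:
let native = (w: 3456, h: 2234)

// HiDPI: the backing frame buffer is twice the "looks like" size on each axis
func renderedSize(looksLike: (w: Int, h: Int)) -> (w: Int, h: Int) {
    (looksLike.w * 2, looksLike.h * 2)
}

for mode in [(w: 1728, h: 1117), (w: 2056, h: 1329)] {
    let r = renderedSize(looksLike: mode)
    let note = (r.w == native.w && r.h == native.h)
        ? "matches native, no scaling"
        : "GPU downscales to \(native.w)x\(native.h)"
    print("looks like \(mode.w)x\(mode.h) -> renders \(r.w)x\(r.h) (\(note))")
}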
 

Alameda

macrumors 65816
Jun 22, 2012
1,270
866
A quick comment on this: HiDPI/Retina simply means that a "logical pixel" (what software sees as a pixel) is not the same as the actually rendered pixel. In HiDPI mode on a Mac, a single logical pixel is represented by a 2x2 array of backing pixels. In your example (3024x1964), the resolution software sees is 1512x982, with each pixel rendered with sub-pixel precision using 4 pixels in memory. The resulting 3024x1964 buffer is then downsampled to the native resolution of the display.
Just curious… where’s the magic? This has been possible any time you run a display at half its horizontal and vertical resolution… unless Apple does special rendering of fonts and photographs, for instance.
 

throAU

macrumors G3
Feb 13, 2012
9,198
7,344
Perth, Western Australia
It renders the image at 2x the height and 2x the width of the selected resolution and then scales it down to the physical resolution. So a higher resolution is more demanding, but even the default resolution is being scaled.

If you download apps like EasyRes, QuickRes, etc. you can select native resolutions without Retina scaling. But they will either be blurry or render the UI incredibly tiny.

This. All non-custom-third-party resolutions are scaled using the GPU.

Unless you're on some massively underpowered GPU configuration I wouldn't worry about it; just run whatever resolution works for you.

I sometimes switch between default and "more space" on my machines depending on what I'm doing and to be honest haven't noticed any real difference in perceived speed.

I'm sure you could maybe benchmark it to show a difference on some of the iGPU models, but the last machine I was doing this with was a 2020 Air and it worked just fine.

So yeah, just run whatever works best for your workload and change as appropriate.
 

Romahaaa

macrumors member
Feb 24, 2021
36
1
Checked all the comments here and still couldn't draw a conclusion. Just yesterday I watched a video on YouTube where the author explained an issue on his M1 Pro MBP with an external 4K monitor attached. He said that when working on heavy GPU tasks like 3D rendering, his Mac lagged badly at scaled resolutions. To be more exact, it works fine at native 4K and at the default, but everything in between lags. If you've used a 4K monitor on a Mac you may know that neither the default nor the native 4K resolution is great: too big / too tiny.

From my experience with a 2015 MBP, I had the same problem even without heavy GPU tasks. My coding editor (Rider) lagged so much that I just couldn't use it.

Checked this topic to find out whether that issue has been solved or still exists, because I'm planning to upgrade my Mac to an M1, but didn't get a definite answer. I guess the only solution is to use a 5K monitor at a "looks like 1440p" resolution, which works fine even on my old Mac, but the price of 5K monitors is insane.
 

Basic75

macrumors 68020
May 17, 2011
2,098
2,446
Europe
Suppose you have a 3840x2160 pixel 4K/UHD display.

You have two choices of "native resolution".

One is 3840x2160 where each layout pixel corresponds to exactly 1 pixel on-screen.

The other is 1920x1080 HiDPI where each of the 1920x1080 layout pixels corresponds to exactly 2x2 pixels on-screen, thereby using all 3840x2160 pixels but in a different way (everything will be twice as large on-screen).

Then you can use scaled resolutions.

Traditionally this meant the computer outputting a different resolution and letting the display scale to its native resolution, so if you output 1920x1080 to our 3840x2160 display you'll get a pixel-perfect 200% scaling courtesy of the display which means everything will look exactly like on a 1920x1080 display of the same size, or if you output, say, 2560x1440 then the scaling will be "somewhat" blurry because it's not pixel perfect.

With Retina-capable Macs, scaled resolutions usually work differently. Still assuming our 3840x2160 display, if you choose a resolution like "looks like 2560x1440", it is rendered as 2560x1440 HiDPI, i.e. 5120x2880 actual pixels, which is then scaled down to 3840x2160 by the GPU and the result sent to the display.

This will be less blurry than just rendering and sending 2560x1440 pixels to a 3840x2160 display, independent of whether the GPU or the display does the scaling to the final output resolution, but it's still not pixel-perfect and it places the burden of scaling to the display's resolution on the GPU.
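
As a quick numeric illustration of why only the 2x case is pixel-perfect on that 3840x2160 display (just the arithmetic from the paragraphs above, nothing queried from the system):
Code:
let nativeWidth = 3840.0

for looksLikeWidth in [1920.0, 2560.0] {
    let renderedWidth = looksLikeWidth * 2        // HiDPI backing buffer is 2x on each axis
    let downscale = nativeWidth / renderedWidth   // 1.0 means pixel-perfect, anything else is resampled
    print("looks like \(Int(looksLikeWidth)) wide -> renders \(Int(renderedWidth)) wide, downscale factor \(downscale)")
}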
 

Romahaaa

macrumors member
Feb 24, 2021
36
1
@Basic75 yes, correct. From what I understood, choosing a 1.x scale leads to lag because of GPU load. That is pretty clear in the case of my old 2015 Retina MBP (though it's still not that weak), but the big question is why this is still a problem on the newest M1 Pro, according to the video I mentioned.

At the same time, I recently watched another video where the author attached 3 external monitors to an M1 Pro Max, checked performance on some tasks (they related more to RAM and CPU load actually, but he also played 4K videos in a browser window on each monitor) and didn't notice any performance drops... weird
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
@Basic75 yes, correct. From what I understood, choosing a 1.x scale leads to lag because of GPU load. That is pretty clear in the case of my old 2015 Retina MBP (though it's still not that weak), but the big question is why this is still a problem on the newest M1 Pro, according to the video I mentioned.

At the same time, I recently watched another video where the author attached 3 external monitors to an M1 Pro Max, checked performance on some tasks (they related more to RAM and CPU load actually, but he also played 4K videos in a browser window on each monitor) and didn't notice any performance drops... weird

The GPU overhead is negligible. Even M1 can resample 8K images many thousands of times per second without blinking.

But I can imagine that under certain conditions memory bandwidth can become an issue. You have to move frames from the GPU to RAM and then back to the display controller, and with high resolutions and large redraw areas the bandwidth required can become significant (no idea if the final downsampling is done on the GPU or by the display controller; if it's the former then you have two more image transfers). M1 only has ~70GB/s, so if you are running multiple high-res displays while also doing bandwidth-intensive work you can run into contention.
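
A rough back-of-the-envelope sketch of that bandwidth point; the frame size, refresh rate and number of copies per refresh are illustrative assumptions, not measurements:
Code:
import Foundation

let width = 6016.0, height = 3384.0   // assume a "looks like 3008x1692" HiDPI buffer for a 4K display
let bytesPerPixel = 4.0               // 8-bit BGRA
let fps = 60.0
let copiesPerFrame = 2.0              // assume the frame is written once and read once per refresh

let gbPerSecond = width * height * bytesPerPixel * fps * copiesPerFrame / 1e9
print(String(format: "~%.1f GB/s of the ~70 GB/s budget", gbPerSecond))
That comes out to roughly 10 GB/s for full-screen redraws at 60Hz under those assumptions, which is small on its own but not nothing once several displays and other bandwidth-heavy work are involved.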
 
  • Like
Reactions: rmadsen3