
rechner

macrumors newbie
Original poster
May 29, 2020
3
0
I *know* that the Apple Pro Display XDR is a 6K display. It says so everywhere. Yet, macOS Catalina (10.15.5) is reporting my XDR display as an 8K display:

[Attachment: XDR.png]


Clearly, something is not right. That said, using the Parallels Toolbox resolution switcher I can set resolutions of 3840x2160 (Retina) as well as 7680x4320 (and it looks different from 6016x3384).

If I take a screenshot (Shift+Cmd+3) at either 3840x2160 or 7680x4320, the resulting image is 7680x4320 pixels.

So, either this really is an 8K display (really?!?) or something else is going on.

I'm stumped. Anyone got any ideas on what might be behind this?
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
macOS reports the framebuffer size and framebuffer pixel format. Neither is related to the output resolution or output pixel format. Apple should really fix that UI.
SwitchResX will show the output resolution / timing.
The AGDCDiagnose command will show that, plus the pixel format (depending on the GPU).
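
If anyone wants to see this programmatically, here's a minimal sketch (Swift + CoreGraphics, assuming a recent macOS SDK) of the two sizes macOS tracks for the current mode: the point ("UI Looks Like") size and the framebuffer pixel size. Neither of these is the signal actually sent to the display - that's what SwitchResX or AGDCDiagnose report.

Code:
import CoreGraphics

let display = CGMainDisplayID()
if let mode = CGDisplayCopyDisplayMode(display) {
    // Point size - the "UI Looks Like" value
    print("UI looks like: \(mode.width) x \(mode.height) points")
    // Backing framebuffer size - what System Information calls the "resolution"
    print("Framebuffer:   \(mode.pixelWidth) x \(mode.pixelHeight) pixels")
    // On an XDR set to a "looks like 3840x2160" HiDPI mode this should print
    // 3840x2160 points and 7680x4320 pixels - the "8K" number in the screenshot above -
    // even though the output signal stays 6016x3384.
}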

The GPU scales the framebuffer to create the output signal which in your case will probably be 6016x3384.

SwitchResX will show all the resolutions available and will show which ones are scaled. HiDPI resolutions (for Retina displays - where "UI Looks Like" is half the width and height of the framebuffer resolution) are ones where the text is drawn twice as wide and twice as tall (four times more detail for smaller pixels).

Usually all the scaled resolutions have different framebuffer sizes that are all scaled to the same output resolution (in your case, probably 6016x3384). Every scaled resolution above a certain size creates two modes - one that is HiDPI (with width and height being half the framebuffer width and height) and one that is not.

You can go into the Displays preferences panel, hold the Option key, then click "Scaled". Resolutions that say "(low resolution)" are not HiDPI.
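
If you'd rather poke at this alongside SwitchResX, here's a rough sketch that lists every mode for the main display, including the low-resolution duplicates, and flags which ones are HiDPI (framebuffer pixels are exactly twice the point size in each dimension). I believe kCGDisplayShowDuplicateLowResolutionModes is the CoreGraphics option key for including the non-HiDPI duplicates, but verify against the headers.

Code:
import CoreGraphics

let display = CGMainDisplayID()
// Ask for the non-HiDPI ("low resolution") duplicates as well.
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
    for mode in modes {
        let hiDPI = mode.pixelWidth == mode.width * 2 && mode.pixelHeight == mode.height * 2
        print("\(mode.width)x\(mode.height) points -> " +
              "\(mode.pixelWidth)x\(mode.pixelHeight) pixel framebuffer" +
              (hiDPI ? "  [HiDPI]" : "  [low resolution]"))
    }
}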
 

Christopher Kim

macrumors 6502a
Nov 18, 2016
768
741
Yeah, it's because you're currently running the monitor at "Looks like 3840 x 2160", i.e. 4K, which is not a "perfect multiple" of your actual 6K monitor (a perfect multiple would be either 1:1 at 6016x3384 or pixel-doubled at 3008x1692).

I currently run my 27" LG 4K monitor at "Looks like 2560 x 1440". Here's what my system report shows:

[Attachment: Screen Shot 2020-05-29 at 2.06.33 PM.png]


So it shows the resolution as 5K and the UI as looking like 2560 x 1440. @joevt may be correct about how it actually works (I don't claim to know), but I've always thought of it this way: for any in-between resolution, the Mac doubles it (so it takes my "Looks like 2560 x 1440" and HiDPI-scales it to 5120 x 2880), then sizes it down to the actual pixels of my monitor (which is 4K, 3840 x 2160). That's why I've understood that running my 4K monitor at "Looks like 2560 x 1440" on a Mac looks much better than running the same monitor in Windows at a 2560 x 1440 resolution.

If you were to change your resolution to either "Looks like 6016 x 3384" or "Looks like 3008 x 1692", I imagine the same System Report window would show the Resolution as 6K.
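
Put as a little back-of-the-envelope sketch (this is just my mental model from above, not a claim about how macOS is implemented internally):

Code:
// "Looks like" size -> doubled HiDPI framebuffer -> downscaled to the panel's native pixels.
struct ScaledMode {
    let looksLike: (w: Int, h: Int)   // what Displays preferences calls "Looks like"
    let native: (w: Int, h: Int)      // physical panel pixels
    var framebuffer: (w: Int, h: Int) { (looksLike.w * 2, looksLike.h * 2) }
    var downscale: Double { Double(native.w) / Double(framebuffer.w) }
}

// My 27" LG 4K at "Looks like 2560 x 1440":
let lg = ScaledMode(looksLike: (2560, 1440), native: (3840, 2160))
print(lg.framebuffer, lg.downscale)   // (5120, 2880), 0.75 - the "5K" in my system report

// The OP's Pro Display XDR at "Looks like 3840 x 2160":
let xdr = ScaledMode(looksLike: (3840, 2160), native: (6016, 3384))
print(xdr.framebuffer, xdr.downscale) // (7680, 4320), ~0.78 - the "8K" in the OP's screenshot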
 

rechner

macrumors newbie
Original poster
May 29, 2020
3
0
Thanks for the responses. I installed SwitchResX and can now see that it lists both 3840x2160 and 7680x4320 as "scaled" resolution options. It's interesting to me that capturing a screenshot respects that scaled resolution even when it's greater than the display's actual maximum resolution. That's pretty clever.

Cheers!
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
Thanks for the responses. I installed SwitchResX and can now see that it lists both 3840x2160 and 7680x4320 as "scaled" resolution options. It's interesting to me that capturing a screenshot respects that scaled resolution even when it's greater than the display's actual maximum resolution. That's pretty clever.
Capturing a screenshot uses the framebuffer size - that's where everything is drawn in memory. I don't think the system will let you see the scaling result that is transmitted to the display (but Apple could make that possible or could make a reasonable facsimile - or you can manually scale your screenshot). After transmitting to the display, the display can do additional scaling. For example, a 4K display can receive a 1440p signal and scale it to 4K.
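
If you want to try the manual-scaling idea, here's a rough sketch (Swift, CoreGraphics/ImageIO) that captures the framebuffer-sized image - the same thing a screenshot contains - and redraws it at the output resolution to approximate what the GPU sends to the panel. The 6016x3384 target is the XDR's native size from this thread; adjust for your display, and note that capturing the screen needs the Screen Recording permission on Catalina.

Code:
import CoreGraphics
import ImageIO

let display = CGMainDisplayID()
if let capture = CGDisplayCreateImage(display) {                // framebuffer-sized capture
    let (outW, outH) = (6016, 3384)                             // output/native resolution
    if let ctx = CGContext(data: nil, width: outW, height: outH,
                           bitsPerComponent: 8, bytesPerRow: 0,
                           space: CGColorSpace(name: CGColorSpace.sRGB)!,
                           bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) {
        ctx.interpolationQuality = .high                        // smooth, GPU-style downscale
        ctx.draw(capture, in: CGRect(x: 0, y: 0, width: outW, height: outH))
        if let scaled = ctx.makeImage(),
           let dest = CGImageDestinationCreateWithURL(
               URL(fileURLWithPath: "/tmp/scaled-screenshot.png") as CFURL,
               "public.png" as CFString, 1, nil) {
            CGImageDestinationAddImage(dest, scaled, nil)
            CGImageDestinationFinalize(dest)
        }
    }
}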

The scaling algorithm of a display can be different from the scaling algorithm of the GPU. For example, my Apple Cinema 30" display (2560x1600) can receive a 1280x800 signal and scale it exactly by quadrupling the pixels. It appears very sharp/pixelated. My GPU can scale a 1280x800 framebuffer to 2560x1600 using some kind of interpolation, which makes it look smooth - very different from the display scaling.

A third type of scaling is draw scaling. For example, macOS has a HiDPI mode that draws everything four times larger/more detailed. In Windows, you can set a UI scaling percentage of up to 300%, but it's not as consistent as in macOS.

If you double-click the resolution in the Current Resolutions tab of SwitchResX for the display, it will show the output timing info. But even that info is not real for a display that uses two DisplayPort signals - one for the left half and one for the right half of the screen - for example, the LG UltraFine 5K display or the iMac 5K Retina display. In those cases, SwitchResX shows a pixel clock like 938 MHz, but in reality there are two DisplayPort signals with pixel clocks of 469 MHz each. The output of AGDCDiagnose has that info.
 

rechner

macrumors newbie
Original poster
May 29, 2020
3
0
@joevt You are precisely correct. Here's what SwitchResX tells me about the timing:

[Attachment: Screen Shot 2020-05-29 at 4.18.02 PM.png]


Thanks for all the detailed info on what goes on behind the scenes.
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
@joevt You are precisely correct. Here's what SwitchResX tells me about the timing:
Thanks for all the detailed info on what goes on behind the scenes.
The XDR display is interesting because it supports:
a) Single DisplayPort 1.4 for 6K resolution using HBR2 + DSC, for Macs with GPUs that support Display Stream Compression (AMD Navi graphics like the W5700X, Intel Gen 11 graphics in 10th-gen CPUs, Nvidia RTX in Windows).
b) Dual DisplayPort 1.4 for 6K resolution using two HBR3 signals, for Macs that support DisplayPort 1.4 but not DSC.
c) Dual DisplayPort 1.2 for 5K resolution using two HBR2 signals, for Macs that support DisplayPort 1.2.
d) Single DisplayPort 1.2 for 5K resolution using one HBR2 signal (6 bpc in Windows). I don't think there's a 5K single-HBR3 mode (which would allow 8 bpc).

You can't tell the difference between (a) and (b) using SwitchResX. Both will say 6016x3384@60.000Hz 210.960kHz 1286.01MHz h(8 32 40 +) v(118 8 6 -). AGDCDiagnose will show the difference.
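
As a rough sanity check on why (a) and (b) are needed at all (my own back-of-the-envelope numbers, not something from AGDCDiagnose): the quoted 1286.01 MHz pixel clock at 10 bpc simply doesn't fit in a single uncompressed DisplayPort link.

Code:
// Approximate link-budget math for 6016x3384@60 (10 bpc RGB).
let pixelClockMHz = 1286.01                      // from the SwitchResX timing above
let bitsPerPixel  = 30.0                         // 10 bpc x 3 channels
let neededGbps    = pixelClockMHz * bitsPerPixel / 1000.0   // ~38.6 Gbit/s uncompressed

// Effective payload rates for a 4-lane DisplayPort link (8b/10b coding):
let hbr2Gbps = 4 * 5.4 * 0.8                     // 17.28 Gbit/s
let hbr3Gbps = 4 * 8.1 * 0.8                     // 25.92 Gbit/s

// Even HBR3 can't carry this uncompressed, hence HBR2 + DSC (a) or two HBR3 signals (b).
print(neededGbps, hbr2Gbps, hbr3Gbps)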
 

frou

macrumors 65816
Mar 14, 2009
1,394
2,003
The scaling algorithm of a display can be different from the scaling algorithm of the GPU. For example, my Apple Cinema 30" display (2560x1600) can receive a 1280x800 signal and scale it exactly by quadrupling the pixels. It appears very sharp/pixelated.
That's an exceedingly rare feature on computer monitors in my experience. Several times in the past, I've wished to be able to get that result, and instead been met with some braindead soft-looking interpolation.

I always just assumed that not much thought went into the cheap generic scaler chips, or at least the implementation of them in typical monitors.
 

sakagura

Suspended
Feb 29, 2020
86
131
I *know* that the Apple Pro Display XDR is a 6K display. It says so everywhere. Yet, macOS Catalina (10.15.5) is reporting my XDR display as an 8K display:

[Attachment: XDR.png]

Clearly, something is not right. That said, using the Parallels Toolbox resolution switcher I can set resolutions of 3840x2160 (Retina) as well as 7680x4320 (and it looks different from 6016x3384).

If I take a screenshot (Shift+Cmd+3) at either 3840x2160 or 7680x4320, the resulting image is 7680x4320 pixels.

So, either this really is an 8K display (really?!?) or something else is going on.

I'm stumped. Anyone got any ideas on what might be behind this?


Nvidia drivers on Windows have a feature called DSR that lets you use an 8K desktop scaled down onto a 4K screen. Everything looks tiny, so you have to set the scaling to 200% to be able to read the fonts and icons. It's a bad idea anyway because it breaks some apps.
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
Nvidia drivers on Windows have a feature called DSR that lets you use an 8K desktop scaled down onto a 4K screen. Everything looks tiny, so you have to set the scaling to 200% to be able to read the fonts and icons. It's a bad idea anyway because it breaks some apps.
DSR: https://techreport.com/review/27102/maxwells-dynamic-super-resolution-explored/
Seems like it's a feature specific to gaming?

macOS scaled resolutions are not specific to gaming. The max scaled resolution is limited by the GPU and drivers. I can do a 16:9 resolution of 14400x8100 with an AMD RX 580 (scaled to 4K output). Scaled resolutions are how macOS creates most or all of the modes you see (from low-res like 640x480 to high-res like 8K). SwitchResX will show the resolutions that are not scaled in bold if you have the "Show best resolutions for display in bold" option selected.

HiDPI is similar to a Windows 200% scale (or is it 400%, since it doubles both the width and height of text and objects? I'd have to check Windows to find out what they mean by 200%). In earlier versions of macOS the scale could be arbitrary, like Windows. I haven't tried forcing an arbitrary scale in the current version of macOS (change the 2 to a 3 in the code that adds the HiDPI mode from a scaled mode).

I wonder if the scaling that macOS does affects performance? macOS is usually slower than Windows for games.
 

Christopher Kim

macrumors 6502a
Nov 18, 2016
768
741
I wonder if the scaling that macOS does affects performance? macOS is usually slower than Windows for games.

Pretty sure it does - I've seen a few threads on here that talk about the performance hit of HiDPI scaling like this.

Here's one I found:
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
Pretty sure it does - I've seen a few threads on here that talk about the performance hit of HiDPI scaling like this.
Right - if you increase the size of the framebuffer, there are more pixels to draw, which takes more time.

But my question is: if you don't change the framebuffer size, does scaling by the display perform much better than scaling by the GPU? For example, with a 1440p framebuffer, does a 1440p output to a 4K display (scaling done by the display) perform noticeably better than a 4K output (scaling done by the GPU)? This depends on whether the scaling task can be done in parallel on the GPU.
 

joevt

macrumors 604
Jun 21, 2012
6,971
4,262
I can do a 16:9 resolution of 14400x8100 with an AMD RX 580 (scaled to 4K output)
This might only be true if macOS thinks your display uses multiple tiles (even if the display is only connected with one DisplayPort cable/signal). Otherwise the limit is 8Kx8K. I need to do more testing.
 