To be precise, you just want the UI scaling that is a byproduct of the HiDPI mode. In other words, if you had a proper UI scaling option you wouldn't bother with the "fake HiDPI" resolution. Correct?
If you mean UI scaling like Windows, I think it would be a nice feature to have. Similar to the experimental arbitrary scaling macOS had in Leopard and Snow Leopard.
Seriously, running a 1280x800 UI on a 30" screen? Seriously??
Not really. Someone else maybe. If I need a screenshot of HiDPI, I can choose 1280x800 HiDPI, or I can choose 2560x1600 HiDPI, which looks very much like 2560x1600 on my 2560x1600 display. I could go nuts and do 3272x2045 HiDPI (I think I have an 8K x 4K limit on my old Mac's GTX 680).
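Back-of-the-envelope, assuming HiDPI simply doubles the framebuffer in each dimension and the limit really is 8192x4096 (my guess, not a confirmed spec):

```python
# Rough sanity check: a HiDPI mode renders into a framebuffer twice the requested
# size in each dimension. Assuming the GTX 680's limit is 8192x4096 (my assumption),
# see which modes fit.
MAX_W, MAX_H = 8192, 4096   # assumed maximum framebuffer size

for w, h in [(1280, 800), (2560, 1600), (3272, 2045)]:
    fb_w, fb_h = 2 * w, 2 * h                      # backing store is rendered at 2x
    fits = fb_w <= MAX_W and fb_h <= MAX_H
    print(f"{w}x{h} HiDPI -> {fb_w}x{fb_h} framebuffer, fits: {fits}")
```

3272x2045 doubles to 6544x4090, which just squeezes under that assumed limit.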
Of course, but are we really talking about shrinking pixels here? Do the pixels of your monitor shrink when you select a HiDPI resolution?
What kind of sorcery modifies the pixel size on my monitor? I want that. If we are really talking about that kind of magic, I rest my case.
I mean getting a display with more pixels. Or having the display further away.
They are wasted. There are no additional antialiasing benefits from doing this. In fact, it can even introduce errors in text rendering (let's call it fuzziness). Antialiasing is correctly calculated on the native pixel grid of the screen. We are talking about text here, not real-time graphics (which would benefit from this kind of supersampling, at the expense of some extra GPU sweat).
Antialiasing of text is calculated on the pixel grid of the framebuffer.
In addition, I've looked at some test patterns with alternating colors (no antialiasing on the pixel grid of the framebuffer - the patterns were checkerboards, vertical lines, horizontal lines, etc.). When the framebuffer is scaled for output to the display, pixels are averaged or interpolated, which means the information in the extra pixels is not completely thrown away. Black and white vertical lines appear gray after scaling down - this means the black and white pixels were both used in the final image; pixels are not skipped. You lose their exact position, contrast, and separation, but that's unavoidable when you reduce the number of output pixels.
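Here's a minimal sketch of that gray-lines observation, assuming the downscale is a plain 2x2 box filter (the real compositor may use something fancier):

```python
import numpy as np

# A tiny framebuffer patch with single-pixel black and white vertical lines.
fb = np.zeros((4, 8))
fb[:, ::2] = 255.0                 # white columns at x = 0, 2, 4, ...

# Downscale 2:1 with a 2x2 box filter (averaging), standing in for the real filter.
h, w = fb.shape
out = fb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(out)   # every output pixel is 127.5: both black and white source pixels
             # contribute, so nothing is skipped, but the lines collapse to gray
```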
I'm not sure what real-time graphics has to do with scaling - every frame output to the display is scaled, whether the scaling is done once for each change to the framebuffer or once for every frame at the refresh rate.
Then again, you have the case of bitmaps. To make it simpler to digest, imagine a 2560x1440 wallpaper bitmap. It would fit pixel-perfect on your 1440p monitor. But in the "fake" HiDPI mode, the bitmap is first upsampled to whatever "fake" HiDPI resolution you choose and then downsampled in the last step to fit your native pixel grid. If you are OK with working with assets that are presented to you after a dual pass of "sampling", no worries then. I just feel pity for any "professional" who chooses to work that way.
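A rough sketch of that dual pass, using simple linear resampling as a stand-in for whatever filter macOS actually uses, and a hypothetical "looks like 1920x1080" mode on a 1440p panel:

```python
import numpy as np

# One row of a native-resolution wallpaper: alternating dark/light pixels.
native = np.tile([0.0, 255.0], 8)                  # 16 native pixels

def resample_linear(signal, new_len):
    """Linear interpolation onto a new sample grid (stand-in for the real filter)."""
    old_x = np.linspace(0.0, 1.0, len(signal))
    new_x = np.linspace(0.0, 1.0, new_len)
    return np.interp(new_x, old_x, signal)

# "Looks like 1920x1080" on a 2560x1440 panel: the wallpaper is drawn 1.5x larger
# into the HiDPI backing store, then the whole framebuffer is scaled back down
# by 1/1.5 to the native pixel grid.
up = resample_linear(native, int(len(native) * 1.5))    # pass 1: into the backing store
down = resample_linear(up, len(native))                 # pass 2: back to the native grid

print(np.abs(down - native).max())   # > 0: the pixel-perfect bitmap didn't survive
```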
I've done a test of this as well using GraphicConverter.app. The upsampling for the HiDPI framebuffer appears to be perfect for a 200%x200% scaling. It doesn't seem to be using a fuzzy scaling like when I use my GPU to scale 1280x800 (not HiDPI) to 2560x1600 (not HiDPI). My display is an Apple 30" Cinema Display - if it receives a 1280x800 signal, then it uses a non-fuzzy perfect scaling up to 2560x1600 so text is blocky instead of fuzzy. The Apple 30" Cinema Display doesn't have the smarts to scale any other input signal.
If you have arbitrary scaling instead of just 200%x200%, then you can get upscaling problems since it's no longer perfect. Maybe that's one reason Apple chose 200%.
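For comparison, here's why the exact 200% case can round-trip losslessly, at least under my assumption of pixel replication on the way up and a 2x2 box filter on the way down:

```python
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(480, 640)).astype(np.float64)   # any source bitmap

# 200% up: each pixel becomes a 2x2 block (pixel replication).
up = image.repeat(2, axis=0).repeat(2, axis=1)

# 2:1 down: average each 2x2 block (box filter).
h, w = up.shape
down = up.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(np.array_equal(down, image))   # True: every pixel comes back exactly
```

Any non-integer factor breaks that symmetry, which is where the fuzziness comes from.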
There is a problem with the alternating black and white patterns in downscaling, though. Sometimes they are perfect, and sometimes they are gray. I think they turn gray when the image is drawn starting from an odd pixel. For example, if a display is set to 2560x1600 HiDPI on a 2560x1600 display, and you have a 640x480 image with some single-pixel horizontal lines and some single-pixel vertical lines, and the image is drawn using 1280x960 pixels to the HiDPI framebuffer, then everything is perfect if the image is drawn at an even offset (0, 2, 4, ...). If x is odd, then the vertical lines are gray. If y is odd, then the horizontal lines are gray. If both x and y are odd, then both the horizontal and vertical lines are gray.
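A small simulation of that odd/even effect, assuming each image pixel lands as a 2x2 block in the HiDPI framebuffer and the downscale is a 2x2 box filter (both of which are just my model of what's happening):

```python
import numpy as np

def downscale2(fb):
    """2:1 downscale with a 2x2 box filter (my stand-in for the real filter)."""
    h, w = fb.shape
    return fb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def vertical_line(x, size=8):
    """A single-image-pixel vertical line drawn at 2x into the HiDPI framebuffer,
    so it covers framebuffer columns x and x+1."""
    fb = np.zeros((size, size))
    fb[:, x:x + 2] = 255.0
    return downscale2(fb)

print(vertical_line(x=4)[0])   # [  0.   0. 255.   0. ]   even x: crisp 1-pixel line
print(vertical_line(x=5)[0])   # [  0.   0. 127.5 127.5]  odd x: two gray columns
```

Horizontal lines flip to gray the same way when y is odd.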
Moving a window in HiDPI mode moves it two pixels at a time so you don't see shimmering that way.
Resizing a window in HiDPI mode resizes it in two-pixel increments. If the image is centred in the window, that's when you see the shimmering (the two-pixel-wide black and white lines alternate between being perfect and being gray as you keep changing the size).
Scrolling the window does not always use two-pixel increments (because the increments are a ratio of the image size and window size), so shimmering can occur with horizontal lines when scrolling vertically, or with vertical lines when scrolling horizontally.
With your wallpaper example, the image is always aligned at 0,0 and it's static, so you don't see the animated shimmering that you see with scrolling or resizing. Plus, wallpapers usually don't have single-pixel details/objects.
To sum it up, what you get with these "HiDPI hacks" is simply a workaround for the lack of UI scaling, one that gives you bigger text (with wasted rendered pixels plus any downscaling errors)*. Is it worth it? Is it worth it more than investing in a better screen with proper "sharpness" (it doesn't have to be high resolution / high DPI)?
*The only "valid/correct" HiDPI hack is that of running half the native resolution in HiDPI. But I'm seriously asking, can you run a 1280x800 UI on a 30" screen? Or a 1280x720 UI on a 27" screen?
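Just to put numbers on it, a quick sketch (assuming the usual 30-inch 16:10 panel at 2560x1600 native):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from the pixel dimensions and the diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1600, 30)))   # ~101 ppi: the 30" panel's native density
print(round(ppi(1280, 800, 30)))    # ~50 ppi: the effective density of a 1280x800 UI
```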
Depends on what you're doing. I like 1600p low res on my 1600p display. I use 3008p HiDPI on my 4K.
I specifically asked @StellarVixen in a PM about it (Do you really use your computer on 720?), after seeing this. He refused to answer directly.
He said it looks nicer. I'm not going to argue with that.