Effectively, "looks like 2560x1440" mode on a 4k display renders to 5k (5120x2880) internally and then does a non-integer (but high quality) downsampling of that to 4k. If you take a screen shot in that mode you'll get a 5k image. I don't know that its
literally rendering to an internal, full-screen 5k buffer and then downsampling the whole thing - I'd assume its doing something rather more efficient. Still, the likely problems are the GPU power needed for downsampling and an increased demand on video RAM for the buffering.
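For a sense of the numbers, here's a rough back-of-the-envelope sketch in Python. The 2x backing-store factor and the scale-to-native step are how these "looks like" modes behave; the 4-bytes-per-pixel framebuffer format is my assumption, and a real compositor holds more than one such buffer, so treat the MiB figures as lower bounds:

    # Backing-store sizes and per-frame memory for macOS "looks like"
    # modes on a 4k panel. Assumes a 32-bit (4 bytes/pixel) framebuffer.
    NATIVE = (3840, 2160)      # 4k UHD panel
    BYTES_PER_PIXEL = 4        # assumed 8-bit-per-channel BGRA

    def backing_store(looks_like):
        """A "looks like" WxH mode renders into a 2W x 2H backing store."""
        w, h = looks_like
        return (2 * w, 2 * h)

    def frame_mib(size):
        """Memory for one full frame at the given size, in MiB."""
        w, h = size
        return w * h * BYTES_PER_PIXEL / 2**20

    for mode in [(1920, 1080), (2560, 1440)]:
        back = backing_store(mode)
        scale = NATIVE[0] / back[0]   # downsampling factor to the panel
        kind = "integer (no resample)" if scale == 1.0 else "non-integer resample"
        print(f'looks like {mode[0]}x{mode[1]}: renders {back[0]}x{back[1]} '
              f'({frame_mib(back):.1f} MiB/frame), scale {scale:.2f} -> {kind}')

That works out to 31.6 MiB per frame for pixel-doubled "looks like 1920x1080" (no resampling needed, since the backing store exactly matches the panel) versus 56.2 MiB per frame for "looks like 2560x1440", plus a non-integer resample of every one of those frames - which is exactly the GPU and VRAM cost described above.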
Thing is, there's really no reason to expect the Intel 630 graphics to do this smoothly: we're talking about the lowest-common-denominator iGPU, designed so that basic Windows business desktops can skip the cost of a dGPU, while more powerful machines get fitted with PCIe GPUs. Typically, those machines won't be coming with 4k displays, and - even if they do - Windows uses a dynamically resizable UI that makes icons and system fonts a usable size in regular, 1:1 4k mode (as does Linux). That approach has its problems - the Mac "scaled modes" approach may be better, but - like most of macOS - it assumes a half-decent GPU.
So, I guess the message is: if you're getting a Mac Mini, get one of:
- A 2560x1440 display (a bit 2010, but really not bad)
- A 21" 4k display that looks OK in basic "Looks like 1920x1080" pixel-doubled mode
- A 40"+ 4k display that is usable in "raw" 4k mode (your eyesight may vary).
- An eGPU that costs $700 for a $200-$300 GPU (when all you need is a $100 GPU) - which might not be such a bad idea if you actually need a $1000 GPU, although it still defeats the object of having a tiny, self-contained computer.
I guess it's Apple's way of saying that you should get an iMac....