I get where Apple is coming from on this. Even with vector drawing for fonts, most UI elements, etc., there are still a lot of bitmaps around, and you're going to see scaling artifacts at non-integer scale factors there. Even for vectors, at typical historical screen resolutions you'd want to do a lot of hinting to maintain fidelity at scaled resolutions. You don't want all of your crisp 1-pixel lines turning into blurry 1.3-pixel lines (i.e. 1-pixel black lines next to 30% gray lines when rendered with antialiasing) when running at 130%.
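To make the 130% arithmetic concrete, here's a minimal sketch (my own illustration, not any real renderer's code) of box-filter antialiasing coverage for a 1-pixel line under a fractional scale factor:

```python
def line_coverage(scale, offset=0.0):
    """Per-output-pixel coverage of a 1px-wide line scaled by `scale`,
    starting at `offset` in output pixel coordinates (box filter)."""
    start, end = offset, offset + scale
    coverage = []
    px = int(start)
    while px < end:
        # Overlap of the scaled line with this output pixel.
        left = max(start, px)
        right = min(end, px + 1)
        coverage.append(round(right - left, 6))
        px += 1
    return coverage

print(line_coverage(1.3))              # [1.0, 0.3]: one black pixel, one 30% gray
print(line_coverage(2.0))              # [1.0, 1.0]: crisp at integer scales
print(line_coverage(1.0, offset=0.5))  # [0.5, 0.5]: even a 1x line blurs off-grid
```

The last case shows why hinting matters: a line that isn't snapped to the pixel grid smears across two pixels even at 1×.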
While true, Windows actually renders the UI natively at the scale you select, at least if you stick to the preset 100/125/150/200% options. Most things look crisp at all these sizes.
You have the option to select an arbitrary scale factor as well, but I have never seen this produce good results, though I have not tried it on Windows 8.1.
When you are dealing with legacy applications that require the UI to be scaled up, moving to 2x definitely produces better results than 125% or 150%.
However, once you have pixels so small you can't clearly distinguish them individually, these problems largely go away. With a pixel grid twice as fine, any scaling artifacts are half as severe. Current retina-class screens are not quite as high-res as you'd want them to be for these purposes (per Nyquist you'd really want pixels half the size of the smallest distinguishable features, not about the same size). But I think they're good enough that allowing non-integer scale factors is becoming the right choice.
In theory that's true, but I remain unconvinced. Current UI elements already push past the Nyquist limit, as they frequently consist of high-contrast, single-pixel-wide lines, so you would need a lot of resolution to hide the scaling artifacts.
This is why I don't think the retina scaling options on the MacBook Pros are acceptable, and I wish they would offer higher resolution panels (equivalent to twice the resolution of the previous "high resolution" panels).
When using the scaled resolutions, Apple still renders the UI at 2× and then scales the final image down to match the display resolution. Text quality really suffers, in my opinion (based on the 13″ rMBP I own).
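A back-of-envelope sketch of that pipeline as I understand it (the specific "looks like" resolution here is just one of the selectable options, and this is arithmetic, not an actual API): the UI is laid out at a virtual size, rendered into a 2× backing store, and the whole frame is resampled down to the physical panel.

```python
panel = (2560, 1600)       # 13-inch rMBP physical panel
looks_like = (1440, 900)   # a user-selectable "looks like" scaled resolution
backing = (looks_like[0] * 2, looks_like[1] * 2)  # UI rendered at 2x

# Non-integer resample factor: every frame gets filtered on the way to the
# panel, which is where the text softness comes from.
downscale = panel[0] / backing[0]
print(backing)              # (2880, 1800)
print(round(downscale, 3))  # 0.889
```

So even though applications see a clean 2× environment, the final image lands on the panel through a ~0.889 resample rather than a pixel-exact mapping.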
As I've noted before, Apple had functional if buggy implementations of arbitrary UI scaling available to developers in previous OS releases, so it's not as if OS X's graphics engine fundamentally can't do this. They just need to decide to enable it, squash some bugs, and give third-party developers some time to squash bugs on their end. And I suspect that supporting 1× and 2× has already done a substantial fraction of the work required to support arbitrary scale factors.
I would be surprised if Apple changed the way they handle scaling. Rendering at 2× and scaling the final image does solve a lot of problems, particularly when dealing with non-retina applications.
That being said, non-retina applications still look pretty rough on OS X, even though they're rendered at an "ideal" 2× scale. I think you just have to accept that they are going to look bad no matter what.
When dealing with displays where 1× is too small and 2× is too large, Windows' scaling looks a lot better than the way OS X handles it (either not giving you the option at all, or rendering at 2× and scaling the image to fit).
Application support for retina scaling is very far behind on Windows though, and there's a massive library of legacy applications that will probably never have scaling support added.
Most Mac developers, if they are still updating their product, have already added retina support, or at least intend to. I have spoken with a number of developers for Windows applications, and most of them couldn't care less about adding scaling options.