I thought it's a brand name for specific models?
When it comes to scaling, it doesn't matter who makes the display, it matters what its resolution is.
I guess I'll have to try. Thanks for the advice!
Happy to help. If it's within your budget (~$1300), and your Mac is new enough to drive it, I believe the best-performing option (other than the super-expensive 6K Apple Pro Display XDR) is the 27" 5K LG. I've not used it myself but, in addition to being retina resolution, I believe it has the same panel used in the 27" iMac. Essentially, if you like how Catalina looks on a 27" iMac, then you should like this as well, since the two should look the same (unless there are calibration differences, which you can adjust yourself).
If it's within your budget (~$1300)
That's an extremely expensive budget :/ I've been a PC gamer all my life and I've never spent so much on a screen.
Yup, that's the problem with the changes Apple keeps making in how they render text. They keep forcing upgrades to successively more expensive monitors.
I have a 1080p monitor (at the moment) and fonts look AWFUL. They're great on Windows and Ubuntu, however, so I don't understand why my Mac fonts look so bad. Something to do with "retina" displays, perhaps? I'm tempted to buy a 4K monitor just to fix this issue...
Ubuntu fonts look like complete trash on 1080p displays.
To confirm, High Sierra definitely does not suffer from the same rendering issues that Mojave and Catalina do? I thought I'd read otherwise, so I am currently sitting on Sierra on my '12 MBP -> '12 iMac as external.
That's correct. High Sierra is the last OS to incorporate subpixel rendering natively.
It can be added back using a Terminal command in Mojave, and many feel that improves Mojave, but it still doesn't make it look quite as good as High Sierra. I would guess that is because High Sierra was built to work with subpixel rendering, while Mojave was not, but I don't know the exact reason. I just know that when I tried Mojave with the Terminal hack, it improved things, but text was still more fatiguing to read than in High Sierra.
You can test yourself whether you do or don't mind the change in the newer OS's. Just create a new partition (I don't know what the minimum is, but you don't need that much space -- I used 100 GB, so I'd also have room to install apps and test how they worked, but I think you can do it with much less), and install one of the newer OS's on it, and see for yourself. That's what I did before upgrading, which made me decide not to upgrade.
Yup, I did that with Mojave—tested on a partition. I tried some Terminal hacks including increasing the font weight (like so). I still didn't care for it, and there aren't features in Mojave/Catalina that I would trade for bad font rendering. I do also have a '15 MBA currently running Catalina, and the issue is subtle(?) but obvious on that machine. It really boggles my mind how Apple thinks this is OK.
Yeah, the 2015 MBA has only ~130 ppi (135 for the 11", 128 for the 13"), so it definitely would not look good with Mojave/Catalina. I've got 163 ppi on my 27" 4K, and even that's not good enough. You really need a retina (~220 ppi) display.
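Those ppi figures follow from simple geometry: the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the panel resolutions below are my assumptions based on the usual specs for each model, not stated in the thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed panel specs for the displays discussed above:
print(round(ppi(1366, 768, 11.6)))   # 11" 2015 MBA  -> 135
print(round(ppi(1440, 900, 13.3)))   # 13" 2015 MBA  -> 128
print(round(ppi(3840, 2160, 27.0)))  # 27" 4K        -> 163
print(round(ppi(5120, 2880, 27.0)))  # 27" 5K retina -> 218
```

The 5K result (~218 ppi) is why the 27" 5K displays land right at Apple's "retina" density while a 27" 4K falls well short.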
It's not perfect, but you can re-enable the subpixel anti-aliasing (reboot after running).
Code:
defaults write -g CGFontRenderingFontSmoothingDisabled -bool false
How do I disable it if I don't like it?
Delete the key to go back to the default (reboot after running).
Code:
defaults delete -g CGFontRenderingFontSmoothingDisabled
Apple do have another trick up their sleeve: supersampling. Your desktop is actually rendered at twice your physical resolution in each dimension, and then the GPU does hardware scaling to make it fit. This does make text sharper, but still not as good as sub-pixel rendering. This used to be enabled through a hack, but I believe in Big Sur it is the default. If you go to About This Mac > System Report and check the Displays/Graphics properties, it should give you the framebuffer size. Is this twice your screen resolution?
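A sketch of that arithmetic, assuming the 2x-per-dimension backing-store behavior described above (the 4K scaled-mode example is hypothetical, not from the thread):

```python
def hidpi_framebuffer(looks_like):
    """In a macOS HiDPI mode, the desktop is rendered into a backing store
    at 2x the chosen 'looks like' resolution in each dimension; the GPU
    then scales that to the panel's native resolution if they differ."""
    w, h = looks_like
    return (2 * w, 2 * h)

# "Looks like 2048 x 1152" on a 4096x2304 panel: the backing store matches
# the panel exactly, so no extra scaling pass is needed.
print(hidpi_framebuffer((2048, 1152)))  # (4096, 2304)

# Hypothetical scaled mode, "looks like 2560 x 1440" on a 4K (3840x2160)
# panel: rendered at 5120x2880, then downscaled in hardware -- that
# downscale is the supersampling case described above.
print(hidpi_framebuffer((2560, 1440)))  # (5120, 2880)
```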
In what OS's does Graphics/Displays show framebuffer size? I've got High Sierra, and it shows framebuffer depth only. Can you post a screenshot?
In Catalina, under Hardware > Graphics/Displays, it shows for me:
Resolution: 4096x2304
UI Looks like: 2048 x 1152 @ 59 Hz
Framebuffer Depth: 24 bit Colour
Yeah, I see something like that as well. But you were saying you can check for supersampling by seeing if your framebuffer depth is twice your screen resolution. But the two have nothing to do with each other -- you can't ask whether "4096x2304 is twice 24 bit Colour" -- it makes no sense, because their units are different. One is measuring no. of pixels, the other the color depth. The framebuffer depth listed here just means that each of the three color subpixels will have 8 bits of color resolution (3 x 8 = 24). It has nothing to do with screen resolution.
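To make the units point concrete, here's a quick sketch using the numbers from the report above: pixel count and bit depth only combine meaningfully as a memory size, never as a ratio.

```python
# Resolution counts pixels; framebuffer depth counts bits per pixel.
# They have different units, so "twice" comparisons between them are
# meaningless -- the only sensible combination is a memory size.
width, height = 4096, 2304   # framebuffer resolution from the report above
depth_bits = 3 * 8           # 3 subpixels (R, G, B) x 8 bits = 24-bit depth

frame_bytes = width * height * depth_bits // 8
print(frame_bytes / 2**20)   # MiB needed for a single 24-bit frame
```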