Subpixel antialiasing had been supported for 20 years. I'm not sure why Apple had to remove it when they were still selling and supporting Macs with non-retina displays.
One word: transparency. They wouldn't have had to remove it if they hadn't brought so much transparency into Mojave along with dark mode. The choice was between transparency without color artifacts (dropping subpixel AA) and keeping subpixel AA without the transparency, and they chose the former.
Subpixel antialiasing works because you know what's underneath when you draw the character, and you can do the blending right then and there. Once the character is drawn, that context is effectively lost. With the advent of Core Animation and GPU-accelerated drawing, though, something like a text field may no longer sit on a solid background, especially if that background is a "material" that blends with other content on the desktop (such as the translucent backgrounds added in Mojave). Then it becomes impossible to do the subpixel blend when drawing the character, because the actual background is only known at final compositing time, by which point the per-glyph information needed to blend it properly is gone.
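To make that concrete, here's a minimal sketch (illustrative only, not Apple's code; `Pixel` and the coverage values are made-up names) of grayscale versus subpixel blending. Grayscale coverage can be folded into a single alpha and composited later; per-channel coverage cannot, which is why the background has to be known at draw time:

```swift
struct Pixel { var r, g, b: Double }  // 0.0 ... 1.0 per channel

// Grayscale AA: one coverage value for the whole pixel. The result is
// equivalent to "glyph color + single alpha", so it still composites
// correctly later even if the background underneath changes.
func grayscaleBlend(text: Pixel, over background: Pixel, coverage: Double) -> Pixel {
    Pixel(r: text.r * coverage + background.r * (1 - coverage),
          g: text.g * coverage + background.g * (1 - coverage),
          b: text.b * coverage + background.b * (1 - coverage))
}

// Subpixel AA: a separate coverage value per color channel (the R/G/B stripes
// of an LCD pixel). No single alpha reproduces this, so the blend has to
// happen against the real background right now; composite the result over a
// different background later and the mismatched channels show up as fringing.
func subpixelBlend(text: Pixel, over background: Pixel,
                   coverage: (r: Double, g: Double, b: Double)) -> Pixel {
    Pixel(r: text.r * coverage.r + background.r * (1 - coverage.r),
          g: text.g * coverage.g + background.g * (1 - coverage.g),
          b: text.b * coverage.b + background.b * (1 - coverage.b))
}
```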
One project I worked on had to selectively disable subpixel antialiasing depending on exactly what was happening with the text. If we detected that the background was a solid fill, we enabled subpixel AA. But if the background was clear/transparent, which meant a clear-backgrounded CALayer, we had to disable it, or it would look even worse, with color fringing where the subpixel AA had assumed the wrong background color for the pixel. It's something I also ran into playing with Core Animation when it first came out.
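For what it's worth, the switch looked roughly like this; a sketch under assumptions (the class, the `hasSolidBackground` property, and the drawing code are hypothetical, not the project's actual source), with the key call being Core Graphics' `setShouldSmoothFonts`, which toggles font smoothing for a drawing context:

```swift
import AppKit

// Enable subpixel font smoothing only when we know the view is drawn
// over an opaque, solid background.
final class SelectiveAAView: NSView {
    var hasSolidBackground = true   // assumption: set by whoever configures the view

    override var isOpaque: Bool { hasSolidBackground }

    override func draw(_ dirtyRect: NSRect) {
        guard let ctx = NSGraphicsContext.current?.cgContext else { return }

        if hasSolidBackground {
            NSColor.windowBackgroundColor.setFill()
            dirtyRect.fill()
            ctx.setShouldSmoothFonts(true)    // subpixel AA: safe, background is known
        } else {
            ctx.setShouldSmoothFonts(false)   // grayscale AA: avoids color fringing
        }

        let text = NSAttributedString(string: "Hello",
                                      attributes: [.font: NSFont.systemFont(ofSize: 13)])
        text.draw(at: NSPoint(x: 10, y: 10))
    }
}
```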
This is the same issue Apple was facing. In Mojave, if you enabled subpixel AA and turned dark mode on, you could see the color fringing from the text. In light mode it wasn't really as apparent since the AA was assuming a white background which is generally "close enough" even with some light translucency.
By the word "suck" in the name of this thread, we can conclude that Macs do not provide optimally sharp text on 163ppi displays. The question is: who's to blame? The display maker/buyer for choosing a pixel density right in the middle of the "bad zone", or Apple for not completely eliminating the effect in software? Anti-aliasing, whether at the pixel or subpixel level, is a blurring effect. It can only make blocky text look smoother, not sharper.
Generally, Windows can do sharper text at the same "scale factor" compared to macOS. But a good chunk of that is that Windows achieved resolution independence (in an opt-in way for apps, and they look pretty crusty if they don't opt in and handle it themselves), while Apple did not. Apple instead has to render any non-integer "scale factor" at the next integer up and downscale, which adds a full-screen AA effect that cannot do the sort of things font AA (subpixel or not) can do. And it affects more than just text as a result.
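A worked example with typical (assumed) numbers makes the cost visible, e.g. a 4K panel set to the "looks like 2560x1440" scaled mode:

```swift
// macOS always draws at an integer 2x of the logical size, then resamples
// the whole frame down to the panel's native pixels.
let logical = (w: 2560, h: 1440)          // points that apps lay out against
let backing = (w: logical.w * 2,          // 5120 x 2880 backing store (integer 2x)
               h: logical.h * 2)
let panel   = (w: 3840, h: 2160)          // native 4K pixels

let effectiveScale = Double(panel.w) / Double(logical.w)   // 1.5x "scale factor"
let resample       = Double(panel.w) / Double(backing.w)   // 0.75: every frame is
                                                           // filtered down by this
print(effectiveScale, resample)
```

That 0.75x resample is the full-screen softening step; no amount of per-glyph hinting or subpixel trickery survives it, and it applies to every pixel on screen, not just text.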
Honestly, it's the biggest engineering shortcut Apple has made, and it has bitten them in the butt as Apple went in an entirely different direction than the rest of the display market. That said, it made it a lot faster for developers to hop on board.