Aside from the first sentence, this is just a nonsensical statement.
LOL, well, whether you understand it or not, you seem to be drawing the wrong conclusion. I'd suggest you check out this Apple Developer article explaining HiDPI. Also, it's actually 1 point in user space that represents the 4 pixels; normally I'd say that's being pedantic, but in this case it might help you understand the concepts better.
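To make the point/pixel relationship concrete, here's a minimal AppKit sketch (assuming a standard 2x HiDPI display; the 2.0 fallback is just for illustration):

```swift
import AppKit

// Minimal sketch, assuming a standard 2x HiDPI display: one point in user space
// is backed by a scale x scale block of pixels (2 x 2 = 4 pixels at 2x).
let scale = NSScreen.main?.backingScaleFactor ?? 2.0
let pixelsPerPoint = Int(scale * scale)
print("At \(scale)x, 1 point is backed by \(pixelsPerPoint) pixels")
```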
You are simply wrong about this. Have you actually used a 4K display and tried the various HiDPI settings? If you had, you'd know that isn't true. Because point coordinates in HiDPI are floating-point values, the scaling doesn't have to be a perfect integer multiple.
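If it helps, here's the back-of-the-envelope arithmetic for one common (assumed) setup, a 4K panel running the "looks like 2560x1440" mode:

```swift
// Assumed numbers: a 4K panel (3840x2160) in the "looks like 2560x1440" HiDPI mode.
// macOS still draws at 2x into a virtual backing store, then resamples to the panel.
let panelWidth: Double     = 3840                 // native 4K pixels
let looksLikeWidth: Double = 2560                 // user-space points
let backingWidth           = looksLikeWidth * 2   // 5120 px virtual backing store
let downscale              = panelWidth / backingWidth
print("Downscale factor: \(downscale)")           // 0.75 -- not an integer ratio
```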
Think of it this way...
- Take a screenshot of the desktop on a 27" 5K iMac (5120x2880) running the HiDPI 1440p mode; the screenshot captures the full 5120x2880 backing store.
- Make two copies and, using Photoshop, scale one copy down to 3840x2160 and the other down to 2560x1440 (a rough code sketch of this step follows the list).
- Display the original screenshot on the 5K iMac, the 4K reduction on a 27" 4K display, and the 1440p reduction on a 27" native 1440p display.
- The results, from best-looking to worst: the 5K iMac, then the 4K display, then the 1440p display.
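If you'd rather not use Photoshop, here's a rough Core Graphics sketch of the downscaling step; the file path is just a placeholder, not anything from this thread:

```swift
import Foundation
import CoreGraphics
import ImageIO

// Sketch: downscale a screenshot to an exact pixel size, roughly what the
// Photoshop step above does.
func downscale(_ path: String, to width: Int, _ height: Int) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(URL(fileURLWithPath: path) as CFURL, nil),
          let image = CGImageSourceCreateImageAtIndex(source, 0, nil),
          let space = CGColorSpace(name: CGColorSpace.sRGB),
          let ctx = CGContext(data: nil, width: width, height: height,
                              bitsPerComponent: 8, bytesPerRow: 0, space: space,
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    ctx.interpolationQuality = .high                      // high-quality resampling
    ctx.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return ctx.makeImage()
}

// The two reductions from the list (file name is hypothetical):
let fourK = downscale("/tmp/5k-screenshot.png", to: 3840, 2160)
let qhd   = downscale("/tmp/5k-screenshot.png", to: 2560, 1440)
// fourK and qhd can then be written out and viewed on the respective panels.
```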
If you want to insist that the native 1440p looks better than the non-"perfectly" scaled 4K, you're entitled to see it that way, but that's kind of like sticking your head in the sand.
I wouldn't care to drag this on, but there are so many users just discovering 4K, and they read this stuff, and the misinformation just keeps getting spread over and over.
Starting with the aRGB, you've been offering a lot of suspect advice in this thread and being rather pushy about it.