... 4K falls right in the 163 ppi range, which means the GPU has to render an oversized framebuffer and scale it down roughly 60 times per second. He therefore recommends a 1440p monitor instead of 4K (a 5K image, cut exactly in half by the Mac, lands at precisely 1440p, so no fractional scaling is needed).
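To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. I'm assuming 27-inch panels for all three resolutions (a common size for these comparisons; the exact diagonal is my assumption, not Hunter's):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assuming 27-inch panels throughout (adjust to your own monitor's size).
for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160), ("5K", 5120, 2880)]:
    print(f"{name}: {ppi(w, h, 27):.0f} ppi")
# 1440p: 109 ppi   4K: 163 ppi   5K: 218 ppi

# 5K halved in each dimension is exactly 1440p -- a clean 2x "Retina" mapping,
# so the Mac doesn't have to render oversized and downscale every frame:
print(5120 // 2, 2880 // 2)  # 2560 1440
```

At ~109 ppi, a 27-inch 1440p panel sits at the comfortable 1x density, and 5K sits at exactly 2x (~218 ppi); 4K's ~163 ppi lands in between, which is where the per-frame scaling work comes in.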
Worth watching from the beginning, and he gets into an interesting chart at about the 4m05s mark:
VIDEO: Mac Scaling Explained
One thing Hunter mentions early in his video is that he found varied information in online forums that was all just "people's opinions..." What he doesn't mention is that his video falls into much the same vein: he's offering yet another opinion; his just happens to be in video format instead of text. And while he makes a point of backing his opinion with technical statistics, I'd argue those statistics may or may not be relevant to your specific use case.
Take me, for example: I paired my 5K 2019 iMac with a 4K Dell monitor that I purchased in April 2021, and I've been quite satisfied with it -- but I'm admittedly not the same kind of "graphics specs geek" as Hunter. As a software engineer and casual gamer, I run a very different mix of software on my Mac than Hunter does for his professional graphics workload, so unsurprisingly, I've never noticed any of the rendering delays he bemoans. Beyond that, my configuration differs from his in other ways: my 4K monitor has always pulled double duty, connected to a Windows-based gaming laptop on one side and the iMac on the other, and I switch it between the two computers depending on what I'm doing. That split use case is quite different from Hunter's, and it makes me lean toward the higher resolution as the preferred feature, rather than color accuracy and the like.
So really, you need to decide what's important for your specific use case; if you have a professional, graphics-heavy workload like Hunter's, his analysis may well be relevant to you.