Exactly what I was talking about.
"High RAM usage points to sloppy programming" — yes, usually. Except when the high-memory-footprint algorithm or data structure is vastly more performant than the alternative. But it's mostly true.
We want our processes' memory footprints to be as small as possible in order to fit as many of them as possible into physical RAM. A high-memory-footprint process negatively affects other processes and memory-starves them. Processes COMPETE for RAM.
OSes, on the other hand, don't compete for RAM. They manage it. They own it. They own all the processes and every last bit of memory on the machine. The OS itself has a memory footprint of its own (the kernel, drivers, processes integral to the system and user experience), and let's be honest, macOS is not the tiniest OS around, but there is no one to memory-starve except the OS itself. The OS doesn't owe anyone anything. Its only responsibility is to provide fast memory to processes (in layman's terms), and how it does that is none of the user's (or the processes') business.
OSes use different pagers with different strategies, there's virtual memory, there's memory compression, there are swapfiles, page caches and more. There is literally no "RAM usage" metric anymore that would be comparable between different OSes with different memory management.
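Memory compression alone is enough to break naive comparisons. Here's a toy sketch (not a real pager — it just uses Python's zlib on a fake "page" of repetitive data) of why a gigabyte of "used" memory can cost far less physical RAM than it appears once the OS compresses inactive pages:

```python
import zlib

# Toy illustration of memory compression: an OS like macOS can trade
# CPU for RAM by compressing inactive pages. Typical app data is
# repetitive and compresses well, so the "used" number overstates
# the physical cost. This is a sketch, not how the real pager works.
PAGE_SIZE = 16 * 1024  # 16 KiB, the page size on Apple silicon

# A fake "inactive" page holding repetitive application state
page = (b"idle app state " * 2048)[:PAGE_SIZE]

compressed = zlib.compress(page)
print(f"raw page:        {len(page)} bytes")
print(f"compressed page: {len(compressed)} bytes")
# The compressed copy is a small fraction of the original page,
# yet a naive "RAM usage" metric would count the full 16 KiB.
```

Real compressors in the kernel (macOS uses WKdm/LZ4-family algorithms) make different speed/ratio trade-offs, but the point stands: two OSes reporting the same "usage" number can be holding very different amounts of physical memory.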
macOS will show you some values in Activity Monitor, but those values are to be interpreted by professionals, in the context of macOS, and ideally in the context of "macOS running on an X GB RAM machine".
My MacBook has 16 GB of RAM and its memory usage is currently over 8 GB.
Does that mean my GF's older 4 GB MacBook would die in the same situation? Not at all: I only have Safari open with three tabs, Sublime Text and Skype, nothing else. The rest is macOS knowing we're on a 16 GB machine used as a software development workhorse and managing the memory accordingly.