Mac OS X Leopard Server performs really well in a VM, but starting with Snow Leopard, each successive version of Mac OS X/OS X/macOS pushes more of its rendering work onto APIs that expect a GPU: OpenGL, Core Animation, OpenCL, and so on.
Since all of that has to be translated to the CPU through a series of abstraction layers when you run it in a VM (the OpenGL 2.1 software-rendering driver, Core Image falling back to software, etc.), that's where the poor performance you're seeing comes from.
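If you want to confirm that a guest really is stuck on the software renderer, checking the OpenGL renderer string will tell you. Here's a minimal illustrative sketch using the CGL API (the file name and compile command are my own, not anything the VM tooling provides):

    /* check_renderer.c -- print which OpenGL renderer the system hands you.
       Build inside the guest: clang check_renderer.c -framework OpenGL -o check_renderer */
    #include <stdio.h>
    #include <OpenGL/OpenGL.h>
    #include <OpenGL/gl.h>

    int main(void) {
        /* Accept any renderer, including non-display (offline) ones. */
        CGLPixelFormatAttribute attrs[] = {
            kCGLPFAAllowOfflineRenderers,
            (CGLPixelFormatAttribute)0
        };
        CGLPixelFormatObj pix = NULL;
        GLint npix = 0;
        if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
            fprintf(stderr, "could not choose a pixel format\n");
            return 1;
        }

        CGLContextObj ctx = NULL;
        if (CGLCreateContext(pix, NULL, &ctx) != kCGLNoError || ctx == NULL) {
            fprintf(stderr, "could not create a GL context\n");
            CGLDestroyPixelFormat(pix);
            return 1;
        }
        CGLSetCurrentContext(ctx);

        /* In a VM without GPU acceleration this typically reports the
           Apple Software Renderer rather than a real GPU. */
        printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));

        CGLSetCurrentContext(NULL);
        CGLDestroyContext(ctx);
        CGLDestroyPixelFormat(pix);
        return 0;
    }

Seeing "Apple Software Renderer" in that output is the tell that all the GPU-oriented work described above is being chewed through on the CPU.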
(This isn't universally true. For instance, 10.8 performs better than 10.7, but the trend towards leaning on the GPU holds for most macOS updates. I think El Capitan was also an improvement over Yosemite for VMs, though I don't know why.)
My best advice: with a few exceptions (prefer El Capitan over Mavericks and Yosemite, and Mountain Lion over Lion), use the earliest major version of macOS you can get away with in a VM for the best performance.
This is an area that's absolutely worth filing a feature request for in Feedback Assistant or (for devs) as a Radar. I've been told there are Apple engineers who want to improve the state of graphics support for macOS in a VM, but that they need more Radars to escalate that effort.