You said yourself earlier that it throttles, as in it drops performance by 200MHz at peak heat. That matches what I've found with friends' iMacs of that era, which is why I have yet to upgrade to what I consider an inferior design compared to the 2011 iMac I have.
I said it throttles when both the CPU and GPU are pushed to the max, yes. They are not when you play Civ. And an i7 4790K at 3.8GHz is still 300MHz faster than the i7 4770K, oh, and it's also a lot faster than the top-end 2011 iMac with the i7 2600 at 3.4GHz. There's about a 14% IPC gain on top of the MHz bump.
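To put rough numbers on that claim, here is a back-of-the-envelope comparison. It assumes the ~14% IPC figure quoted above and deliberately ignores turbo behaviour, memory, and GPU differences, so it is a sketch of the argument, not a benchmark:

```python
# Rough single-thread throughput estimate: clock (GHz) x relative IPC.
# The 1.14 IPC factor (Sandy Bridge -> Haswell) is the figure claimed
# in the post above, not a measured result.
def relative_perf(clock_ghz, ipc_factor):
    return clock_ghz * ipc_factor

i7_2600  = relative_perf(3.4, 1.00)   # 2011 iMac, Sandy Bridge
i7_4790k = relative_perf(3.8, 1.14)   # 4790K throttled to 3.8GHz, per the post

uplift = (i7_4790k / i7_2600 - 1) * 100
print(f"Estimated single-thread advantage: {uplift:.0f}%")  # roughly 27%
```

Even with the 200MHz throttle applied, the combined clock and IPC gap works out to well over 20% against the 2011 chip under these assumptions.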
However, you are half my age and I have used computers since 1980; I am fully aware of what hardware is capable of, and what software is capable of.
Right... Let's talk about age... Because that's a flawless argument, right? As a 40 year old, you must know more on every single subject than a stupid 20 year old, right? Cause there's no possible way that I could have even a snippet of information that you don't and therefore, all of this is pointless. Cause even every single song I know, you also know, because you are twice my age. As we all know, age is the determining factor in technological prowess... An argument like that really pushes me close to using profanity, but I'll keep restraint. I hope however that you can see it's an illogical argument, and I hope we can keep this conversation on a factual rather than personal level from now on. Thank you.
In the case of gaming on a blockbuster title designed primarily for newer hardware than what you have, you will not get the best performance out of it unless you compromise a few settings to allow for the age of your hardware.
Minimum settings at 2560x1440 is a rather big compromise when the exact same title can get 55-60FPS on high settings at 3840x2160 under Windows on exactly the same iMac, don't you think? That pretty much undermines the whole "age of the hardware" argument, since it's the same machine. Also, the R9 M295X is a Tonga GPU, which is the second-newest GCN architecture. When production of Civ 6 was wrapping up, the Polaris GPUs weren't out and definitely weren't the optimisation target. Of course there are many layers here, drivers, game software, graphics API, etc., all of which can increase or decrease performance on certain architectures, but the original Firaxis version of the game only requires a Fermi or GCN 1.0 card, and keep in mind that's a mid-range Fermi from 2010.
And as someone who began programming before the original Mac was launched, I find it amusing that you still consider your iMac top of the line more than six months after you bought it.
A lot has changed since then. For instance, Moore's Law no longer holds as true as it did then, and Intel no longer updates their architecture every second release.
And I don't consider my iMac top of the line relative to all iMacs. But it is a top-of-the-line 2014 iMac, as I've been saying. No iMac released in 2014 beats it; therefore, "top-of-the-line 2014" is a correct statement. As I said, however, the only iMac with superior hardware only beats it by about 3%, depending on the task. The best GPU in the newest iMac is only 200GFLOPS faster (mine being 3.5TFLOPS), and the CPU has a lower turbo (running Civ, turbo is sustained pretty much all the time) but higher IPC, so that nearly balances out.
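For what it's worth, the GPU gap in peak-throughput terms can be worked out from the figures above. Note that peak FLOPS rarely translates directly into frame rates, which is presumably why the overall difference is quoted as only a few percent depending on the task:

```python
# Peak-throughput gap between the two GPUs, using the numbers in the post.
mine_tflops  = 3.5         # R9 M295X, as stated above
newer_tflops = 3.5 + 0.2   # newest iMac's top GPU: 200 GFLOPS faster

gap_pct = (newer_tflops / mine_tflops - 1) * 100
print(f"Peak FLOPS gap: {gap_pct:.1f}%")  # about 5.7%
```

So even on paper the newer GPU's headroom is in the single digits.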
And again - Aspyr have already talked to me about this problem, and they admit a very large performance penalty from having to use OpenGL 4.1, since Metal as of now isn't mature enough and Apple hasn't updated OpenGL. Considering the Windows version is written against DX11/DX12, there are a lot of features missing from OpenGL 4.1 that they actually have to fall back to software rendering for, and... well, try rendering the Utah Teapot in software and you'll get what I mean.