For those still interested in this debate, here is an article that helps to clear up some confusion:
The Intel Core i9-9900K at 95W: Fixing The Power for SFF
It also helps to show that both sides of the argument have some validity.
This is especially highlighted in their conclusion about the 95W-restricted 9900K:
"Overall, it acts like a 9900K [i9] in single thread mode, and like a 9700K [i7] in multi-thread mode."
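That conclusion follows from how Intel's turbo power limits work: the chip may draw up to PL2 for a short window (tau), then drops back to PL1 (the TDP). A single-thread task fits in the burst window; a sustained all-core load hits the PL1 cap. Here's a minimal sketch of that behavior; the PL2 and tau values are illustrative assumptions, not the iMac's actual firmware settings:

```python
# Rough sketch of Intel's turbo power-limit scheme (PL1/PL2/tau).
# The pl2 and tau defaults are illustrative, not measured values.
def package_power(t, pl1=95.0, pl2=210.0, tau=28.0):
    """Power (watts) the package is allowed to draw t seconds into a load."""
    return pl2 if t < tau else pl1

# A short burst (e.g. a single-thread task) gets the full PL2 turbo budget,
# while a long all-core render settles at the PL1 (TDP) limit:
assert package_power(5) == 210.0   # bursty load: full turbo power
assert package_power(60) == 95.0   # sustained load: capped at TDP
```

So restricting PL1 to 95W mostly hurts sustained multi-core work, which is exactly where the 9900K falls back to 9700K-like performance.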
So the new iMac i9, thermally restricted to ~85 watts, is really no better, and in some cases worse, than the 9th-gen i7 chip. Though, to be fair, we need to wait until more directly comparable reviews are published about the actual performance of the actual iMac.
Regarding thermals, bear in mind the results we've seen thus far are from a brand new machine. What happens when dust starts accumulating on the heatsink, for example? Thermal performance will never improve, only worsen over the machine's lifespan. The i5 is also turbo boosting nearly at Intel's max spec, whereas the "i9" is far from it. It's not unreasonable to expect this gap to widen.
There has always been some variance between implementations of a given processor, but I think Apple just set the record for the largest deviation with the "i9" iMac. That is my contention.
Why does Puget Systems' i9 achieve a roughly 15–20% better Cinebench R15 score than the iMac? And R15 doesn't load the processor as realistically as R20 does, either. So, as people have pointed out, it sounds like most, if not all, business-class machines are running the 9900K in an already power-restricted mode. If that can be confirmed, what explains the huge discrepancy in benchmarks between an already-throttled 9900K and the iMac?
Is this not due to thermal throttling?