The only 100% hard data I have is what I measured myself in the past: 88-92°C on the 2.4GHz model and 103°C with mild throttling on the 2.6GHz model, under the same 100% CPU stress test. I don't remember the exact Turbo Boost numbers, but it's typically a 0.1-0.2GHz fluctuation on the 2.6GHz model, which was bringing it down into the range of the lower-end model.

I don't have mine yet, unfortunately. And it's not that I "prefer not to listen," I just don't blindly believe anything someone says. I like hard evidence. That's why I tried to find cache power consumption data.
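For what it's worth, the stress scenario itself is easy to reproduce. Here's a rough Python sketch (assuming psutil is installed; the frequency readout can be limited on macOS, and temperatures need a separate tool such as Intel Power Gadget, so this is just an illustration, not the exact tool I used) of loading every core and watching the reported clock for that turbo fluctuation:

```python
# Rough sketch: load all cores at 100% and log the reported CPU clock.
# Assumes psutil is installed (pip install psutil). Temperature logging
# is platform-specific and not included here.
import multiprocessing as mp
import time

import psutil


def burn(stop_time: float) -> None:
    """Busy-loop on one core until stop_time (keeps that core at 100%)."""
    x = 0
    while time.time() < stop_time:
        x = (x + 1) % 1_000_003  # pointless arithmetic just to keep the core busy


def main(duration_s: int = 300, sample_s: int = 5) -> None:
    stop_time = time.time() + duration_s
    workers = [mp.Process(target=burn, args=(stop_time,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()

    # Sample the reported clock while the load runs; a sustained dip below
    # the rated turbo clock is the ~0.1-0.2GHz fluctuation I was describing.
    while time.time() < stop_time:
        freq = psutil.cpu_freq()
        if freq is not None:
            print(f"{time.strftime('%H:%M:%S')}  current={freq.current:.0f} MHz")
        time.sleep(sample_s)

    for w in workers:
        w.join()


if __name__ == "__main__":
    main()
```

Run it for a few minutes, watch the clock trend, and pair it with whatever temperature monitor you trust, and you have your own hard data instead of someone else's anecdote (including mine).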
My main concern was those +10°C, not the performance difference. I hardly notice even a +30% performance boost in the real world unless I'm running something very long and very intensive (a COMSOL simulation, video encoding, or an absurdly unoptimized compiler) and using a timer.