You have no idea what you're talking about. "Work" is a physics term. In electronics, heat and work are nearly the same thing, because essentially all of the energy put into a component is eventually dissipated as HEAT, barring devices that emit light (and even an LED only converts a fraction of its input, something like 20-40%, into light; the rest still ends up as heat).
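To make that energy balance concrete, here's a minimal sketch; the wattage and efficiency figures are assumptions for illustration, not measured values:

```python
# Energy balance: electrical power in ends up as heat out, minus
# whatever fraction leaves as light. All figures here are assumed.
def heat_output_w(power_in_w, light_fraction=0.0):
    """Power dissipated as heat, given the fraction emitted as light."""
    return power_in_w * (1.0 - light_fraction)

print(heat_output_w(110.0))     # GPU: essentially all 110 W becomes heat
print(heat_output_w(1.0, 0.3))  # LED at an assumed 30% efficiency: 0.7 W of heat
```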
As for your assertions about which situations make a GPU last longer, they make little sense. You (again) don't understand that heat and work are essentially the same, and you're also assuming that HEAT = TEMPERATURE, which it does not. Yes, high TEMPERATURES will reduce the life of electronics, but as reported in numerous threads, the ATI 4850 GPU is perfectly within the norms of ALL GPUs. It idles around 40°C-50°C and doesn't normally go above 80°C, which is perfectly acceptable; almost ALL video cards run at these same temperatures. And, again, the fact that most GPUs run at about these temperatures says NOTHING about their WORK or their HEAT. Are you understanding yet??
An ATI 5970 requires a LOT more energy, and thus outputs a LOT more heat, than the iMac's 4850. And of course the wattage requirement of a GPU is more or less a gauge of its WORK, and thus its HEAT, but it says nothing about TEMPERATURE! I bet you the 5970 operates within 5°C of the 4850 at idle and under load, and you know why? A more effective heatsink/fan. It's probably putting out 2-4x the heat, if not more, while staying at the same temperature, because HEAT != TEMPERATURE. Got it yet?
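Here's a sketch of why the temperatures can match while the heat differs, assuming a simple steady-state model, T_die = T_ambient + P × θ, with θ the junction-to-ambient thermal resistance; the power and θ figures below are illustrative assumptions, not specs for either card:

```python
# Steady-state die temperature: T_die = T_ambient + P * theta,
# where theta (°C/W) is junction-to-ambient thermal resistance.
# Power draws and theta values below are assumed for illustration.
def die_temp_c(t_ambient_c, power_w, theta_c_per_w):
    return t_ambient_c + power_w * theta_c_per_w

t_amb = 25.0
print(die_temp_c(t_amb, 110.0, 0.50))  # 4850-class cooler:  80.0 °C
print(die_temp_c(t_amb, 294.0, 0.19))  # 5970-class cooler: ~80.9 °C
# ~2.7x the heat, nearly identical temperature: the bigger cooler's
# lower thermal resistance absorbs the difference.
```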
And the fact that the 5970 may operate at the same TEMPERATURE while doing more WORK (and thus outputting more HEAT) says absolutely nothing about the longevity of that card versus the 4850.
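For what it's worth, the standard way electronics lifetime gets modeled backs this up: an Arrhenius-type acceleration factor depends only on die temperature, not on how many watts the cooler is moving. A sketch, assuming a typical activation energy of 0.7 eV:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K
EA_EV = 0.7                # assumed activation energy, a common default

def lifetime_factor(t_ref_c, t_hot_c):
    """Relative lifetime at t_hot_c versus t_ref_c (Arrhenius model)."""
    t_ref_k, t_hot_k = t_ref_c + 273.15, t_hot_c + 273.15
    return math.exp((EA_EV / K_BOLTZMANN_EV) * (1.0 / t_hot_k - 1.0 / t_ref_k))

# Same 80 °C die temperature -> same modeled aging, whether the card
# dissipates 110 W or ~300 W; only temperature enters the model.
print(lifetime_factor(80.0, 80.0))  # 1.0
print(lifetime_factor(80.0, 90.0))  # ~0.53: hotter die, shorter modeled life
```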
So please, the next time you say "Yeah... No" make sure you have some idea about the subject.