Apple hasn't ordered any new Nvidia card since the 2013 iMac, though. Since then it's been all Intel Iris or AMD FirePro and R9. The 2014 rMBPs are still using the GT 750M. I assume there's been some discord between Apple and Nvidia. It's all Red Team from here, I think.
 
I have no issue with the AMD R9 M295X being inside my riMac; at the time of design it was the only mobile GPU Apple could use, as the Nvidia 980M just wasn't available to Apple.

What gets me about the whole thing is that Apple could have changed the design of the riMac to accommodate the higher thermal output of AMD's mobile GPUs. Its policy of ever-thinner design is its own Achilles' heel in dealing with the higher temps.

If we look at the rMBP design it's the same thing: an ever-thinner design that lets the CPU and GPU overheat and throttle back, with poor heatsink design inside, mainly because there's not enough space to get the heat away from the notebook. Having just reapplied the TIM (thermal interface material) on my MacBook Pro's heatsink, I can tell you that a stamped-out bit of copper push-fitted into an aluminium heat-pipe system, kept basically flat for lack of space and for light weight (also a design requirement), isn't good enough.

I would like to see Apple concentrate a bit more on the design of thermal management inside these expensive machines we're buying, instead of just the outward look of the product. After all, not everyone can afford to replace their MBP or iMac every 3 years to feel safe with AppleCare.

My message to Apple would be: let's get away from this thinner-is-better design flaw and get back to the Apple of old. If it's an Apple machine, it may be expensive, but it will last for years and be reliable until the end of its useful days. I don't want to see my expensive purchase throttling the CPU or GPU because the design can't deal with the heat.
 
Absolutely, but I'm pretty sure that, like many people, we'd have been happy to wait....

*You*, me and a few people would have been happy to wait. For the vast majority the M295X is OK.

It's not about you being personally inconvenienced by the delay, it's about a multi-billion dollar business that would be impacted because of a small % of users.

While Apple can request custom parts, they cannot change the laws of physics. The 980M (GM200) was not ready in time. The available evidence indicates the 980M did not "tape out" until approx. June: https://en.wikipedia.org/wiki/Tape-out
http://www.overclock.net/t/1501230/3dcenter-nvidia-gm200-chip-spotted-for-the-first-time

From that point it takes at least four months to go from the earliest samples to a finalized product. That takes us to October, the first point where mass 980M manufacturing could *begin*. Unfortunately that was way too late for the retina iMac. The *final* chip has to be integrated into the iMac system, that design frozen, final all-up testing done, pilot production, checks, any needed fixes made, final production ramped up, then time to accumulate volume for the launch. It's not like buying a single part for a home-built PC.
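To make the schedule concrete, here's the arithmetic as a quick sketch (Python used purely as a calculator; the June date and four-month lead time are the estimates above, not insider figures):

```python
# Back-of-the-envelope product schedule, using this post's own estimates:
# ~June 2014 tape-out, plus at least four months from first samples to a
# production-ready part. Illustrative assumptions, not Apple's actual plan.
tape_out_year, tape_out_month = 2014, 6   # "approx. June"
lead_time_months = 4                      # minimum, per the post

month = tape_out_month + lead_time_months
year = tape_out_year + (month - 1) // 12
month = (month - 1) % 12 + 1
print(f"Earliest mass production could begin: {year}-{month:02d}")
# -> 2014-10, the same month the retina iMac actually shipped, leaving
#    zero time for system integration, validation, and volume build-up.
```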

There is no question I'd also prefer the 980M, but there were other factors in play. You are right that the 980M is so superior that it will be interesting to see whether the next retina iMac stays with AMD.
 
If the next iMac gets a 980M rather than the next 1080M or whatever follows it, that's like standing still on the dGPU front. Yes, the 980M is better, but not so much better that a next-generation iMac in October 2015 could claim to have a current, best-available dGPU. So for the October 2015 iMac I'm hoping for Nvidia's next dGPU, not the previous model.
 
*You*, me and a few people would have been happy to wait. For the vast majority the M295X is OK.

It's not about you being personally inconvenienced by the delay, it's about a multi-billion dollar business that would be impacted because of a small % of users.

While Apple can request custom parts, they cannot change the laws of physics. The 980M (GM200) was not ready in time. The available evidence indicates the 980M did not "tape out" until approx. June: https://en.wikipedia.org/wiki/Tape-out
http://www.overclock.net/t/1501230/3dcenter-nvidia-gm200-chip-spotted-for-the-first-time

From that point it takes at least four months to go from the earliest samples to a finalized product. That takes us to October, the first point where mass 980M manufacturing could *begin*. Unfortunately that was way too late for the retina iMac. The *final* chip has to be integrated into the iMac system, that design frozen, final all-up testing done, pilot production, checks, any needed fixes made, final production ramped up, then time to accumulate volume for the launch. It's not like buying a single part for a home-built PC.

There is no question I'd also prefer the 980M, but there were other factors in play. You are right that the 980M is so superior that it will be interesting to see whether the next retina iMac stays with AMD.

Again, I don't believe that Apple didn't have access to something better. And as regards losing money: first, reputation = money; second, think how much more money Apple would make with a small delay rather than with a GPU that in my mind is a giant rip-off. The M295X should have been standard, and a 980M or equivalent should have been the $250 upgrade.

This feels like nothing more than planned obsolescence. Apple is not stupid. They did this for a reason, in the same vein as the 1GB of RAM in the iPhone 6/6 Plus line vs. 2GB in the iPad Air 2 a month later.
 
Again, I don't believe that Apple didn't have access to something better. And as regards losing money: first, reputation = money; second, think how much more money Apple would make with a small delay rather than with a GPU that in my mind is a giant rip-off...

Well why stop there? Why not just demand a Skylake CPU right now, and in production quantities? It doesn't work that way.

Re that "small delay", the Mac business alone is $6 billion a year. If they had to delay one quarter, that could equate to a lot of money. There's also a knock-on effect of all other products sync'd to that release, which would also require delaying.

Re reputation = money: is there any indication *whatsoever* this has hurt Apple's reputation in the *general* iMac market -- NOT the enthusiast market? Everything is a tradeoff. Don't think your own needs and desires are perfectly representative of the general market. As the famous quote says, "it's not about you".

That said, I fully agree the 980M is a much better choice, and if the situation with the M295X ever reaches a point where it tarnishes the iMac brand reputation in the *general* market, that's serious.
 
Well why stop there? Why not just demand a Skylake CPU right now, and in production quantities? It doesn't work that way.

Re that "small delay", the Mac business alone is $6 billion a year. If they had to delay one quarter, that could equate to a lot of money. There's also a knock-on effect of all other products sync'd to that release, which would also require delaying.

Re reputation = money: is there any indication *whatsoever* this has hurt Apple's reputation in the *general* iMac market -- NOT the enthusiast market? Everything is a tradeoff. Don't think your own needs and desires are perfectly representative of the general market. As the famous quote says, "it's not about you".

That said, I fully agree the 980M is a much better choice, and if the situation with the M295X ever reaches a point where it tarnishes the iMac brand reputation in the *general* market, that's serious.

I think maybe we're misunderstanding each other. Apple has known for a long, long time that the Retina iMac was coming. After all, like you said, these things take an awfully long time to put through production, testing, etc., before shipping them out to whiners like me.

But I don't believe for a second that Apple didn't have the resources or clout to make a GPU happen that's better than what we ended up with in the M295X. Not for a second. I'm not saying it had to be a 980M. That's obviously an existing GPU now that we can compare against, but again, it really could have been something else completely! Perhaps a "GTX 870MX" or some such variant that exists nowhere else except the iMac.

Again, this is a TWO-year newer computer than my relegated 2012 iMac, without a GPU that's two years better! That's all it comes down to.
 
....But I don't believe for a second that Apple didn't have the resources or clout to make a GPU happen that's better than what we ended up with in the M295X. Not for a second. I'm not saying it had to be a 980M....it really could have been something else completely! Perhaps a "GTX 870MX" or some such variant that exists nowhere else except the iMac....

There really weren't that many choices. There's AMD's best laptop offering, the M295X. There's Nvidia's best, the 980M. Apple could have begun designing and fabricating their own discrete GPU, but that would have taken years, and they'd never do it for such a niche market.

As big as Apple is they can't just say "make it faster", or "please use 40% less power and produce the same performance." If they could demand changes on that scale they'd have made Intel put the Quick Sync hardware transcoder on a custom Xeon just for Mac Pros.

It's obvious today nVidia's Maxwell GPU was a big step forward. That wasn't obvious (possibly even to Apple) six months ago. The chip didn't exist then except as an early design.

It's conceivable Apple signed a deal with AMD six months ago to supply the retina iMac GPU based on their perception at that time of projected AMD vs nVidia price, performance, and availability. By the time both GPUs materialized into actual hardware, it may have been too late to change.
 
At the time when Apple had the GTX 680MX, mobile GTX 600-series parts were already out everywhere. Maxwell came out too late to be in consideration. But I still hold to the belief that Apple is not on good terms with Nvidia, seeing as they didn't bother putting the GTX 800 series in the rMBP while continuing to use the latest and greatest from Intel, and now AMD.

I know you feel disappointed, mate, that the iMac 5K is not significantly better in every way than the iMac 2012. But then desktop computers haven't seen such leaps in years. The desktop GTX 980 is not THAT MUCH better than the GTX 780, which itself is not THAT MUCH better than the GTX 680. Nvidia and AMD have been reusing chips for 3 years! Only now are we starting to get properly new chips and design changes. The big changes are coming, but in the GPU world we're still stuck at 28nm for now.

The R9 M295X is not as powerful as the GTX 980M, nor does it have the lower heat output of Nvidia GPUs from 2012 onwards. But really, nobody's getting a significantly better experience here. The GTX 980M still won't let you play above 1440p Ultra, which we've been able to do since the GTX 680MX. Sure, it'll perform at 4K, just not with a lot of recent games with the eye candy turned on, and the same could be said of the R9 M295X. 4K gaming on a mobile GPU is still years away. Ok, your iMac might make less noise while gaming but come on, how noticeable is that if you've got decent headphones or speakers on? Hell, would nobody point fingers at Intel for making the i7 4790K run hotter than the i7 4770K?

The only huge and major shift is in display resolution, and there we've gained heaps.

Cheers,
 
Don't forget that AMD also provides custom-designed GPUs for the Mac Pro. Such deals are rarely struck in a vacuum - if you look at 2011, all Macs released that year had AMD GPUs.
So of course we can speculate forever about why Apple didn't put the best currently available mobile GPU in the 5K iMac, but the reason could be as simple as a deal struck with AMD many months ago.
 
What do you think, guys... I work in the attic; my 'office' is there and I work 10-15 hours a day. In the summer the temperature rises to 30+ degrees. My iMac 2011 sometimes lags a bit because of the overall temperature. I'm worried that if I go for an almost maxed-out 5K iMac this could be a real problem. What temperatures are you people seeing? Any worries?
 
What do you think, guys... I work in the attic; my 'office' is there and I work 10-15 hours a day. In the summer the temperature rises to 30+ degrees. My iMac 2011 sometimes lags a bit because of the overall temperature. I'm worried that if I go for an almost maxed-out 5K iMac this could be a real problem. What temperatures are you people seeing? Any worries?

I wouldn't worry about it. The thin iMacs cool much more efficiently than the fat iMacs. I live in Thailand and don't always have the air-con on, and my maxed-out iMac 5K's internals sit at around 45 degrees with the fans at 1,200 RPM, so pretty much silent (room temperature has probably been around 28 degrees most days since I've had the iMac). I do put the air-con on while gaming, as always, set to a chilling 25 degrees (hey, that's pretty cold in Thailand); the fans do spin up to 2,000-something depending on how demanding the game is, but no throttling so far. Anyhow, you have nothing to worry about; there's no way the iMac 5K will be louder and hotter than an iMac 2011. I had the iMac 2009 and those things really do run hot, but that iMac survives to this day.
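For the attic question specifically, a crude steady-state estimate may help: at a fixed fan speed the internals sit at a roughly constant offset above room temperature (a simplification I'm assuming; in reality the fans ramp up and shrink the offset). Plugging in the figures from this post:

```python
# Rough rule of thumb: internal temps track ambient plus a fixed offset.
# The 45 C / 28 C pair is from this post; the constant-offset assumption
# is mine.
reported_internal = 45.0   # deg C, light load, fans at 1,200 RPM
reported_ambient = 28.0    # deg C, typical room temperature here
offset = reported_internal - reported_ambient   # ~17 K above ambient

for ambient in (25.0, 30.0, 35.0):              # 30+ C = the attic case
    print(f"ambient {ambient:.0f} C -> internals ~{ambient + offset:.0f} C")
# ambient 25 C -> internals ~42 C
# ambient 30 C -> internals ~47 C
# ambient 35 C -> internals ~52 C
```

Even the 35-degree case stays far below any throttle point at light load; the fans will just work a bit harder.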
 
Hey GUYS, please help!

I just read the thread from the beginning (2 mos. ago, wow!) and want to know just one thing:

Is the lag and stuttering problem in OS X and occasional Call of Duty gaming going to go away, in your opinion, if I go with the M295X over the M290X? Would getting the i7 make a difference?

I'm looking to re-purchase in the next couple of days. My base iMac had terrible stuttering issues even browsing the 5K iMac section at apple.com. Seriously.

And I also noticed the process "WindowServer" a lot - that's handled by the CPU, not the GPU, correct?
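Roughly right - WindowServer is the OS X compositor: it's a CPU process that submits the actual drawing to the GPU, so it can bottleneck on either side. If you want to watch its CPU share yourself, a few lines of Python around the stock BSD `ps` will do it (just a sketch; Activity Monitor shows the same thing):

```python
# Print WindowServer's current CPU usage using the `ps` that ships with
# OS X. No third-party tools required.
import subprocess

out = subprocess.check_output(
    ["ps", "-A", "-o", "%cpu,comm"], universal_newlines=True
)
for line in out.splitlines():
    if "WindowServer" in line:
        print(line.strip())  # e.g. "23.4 /System/Library/.../WindowServer"
```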
 
Ok, your iMac might make less noise while gaming but come on, how noticeable is that if you've got decent headphones or speakers on?

A hand grenade isn't loud if you have proper hearing protection. Seriously, arguments like these are really just admissions that the machine is too loud.
 
A hand grenade isn't loud if you have proper hearing protection. Seriously, arguments like these are really just admissions that the machine is too loud.

No, what I said is that it wouldn't be noticeable at all if you have those things on, which you should while gaming anyway. Good to know that there are people who game with the sound off, in complete silence! I'm not sure where I posted it, but the iMac 5K is in most instances quieter than my air-con, so it's not high on the list of noise-makers in my room. Plus, normal temps with the air-con on and no gaming are 36 degrees, woot!
 
Hey there!

I'm considering buying a 5K iMac - more like in the autumn, really, but still.
This thread has seeded some concerns in me, and because I'm not as fluent in the technical stuff as you guys appear to be, and I want a computer costing this much money to be exactly perfect, I decided to ask some questions. I hope to find some professional iMac users here, not just gamers.

To begin with, apologies if anything I ask has already been covered; I was working through the thread thoroughly but got stuck around page ten.

I don't plan on gaming on my as-yet-hypothetical iMac. What I know I will do for sure is hi-res photo editing and 4K video editing, probably using Final Cut Pro (let's call it FCP). I might also want to do some basic CAD projects with it, but I'm not quite sure about that (it depends on many things connected with my education over the next year). So my question #1 is: why do you gamers say that a professional job like editing 4K video is less demanding on the GPU than gaming? I mean, graphics cards for professionals are far more expensive than graphics cards for gamers (just look at the price of the best Nvidia Quadro on the market today). Additionally, if you look through the FCP website you can see the newest version has been specially optimised to handle TWO high-end professional graphics cards installed inside the Mac Pro. My point is that if FCP didn't demand high graphics performance, nobody would be that fussed about all the optimisation.

Now there are two possibilities: (A) you agree with the above, or (B) you don't.
If (A): Has anybody done research to find out whether throttling is an issue on Yosemite as well, or is it Windows-only territory?
If (B): First of all, why? Not because I want to have an argument; I want to see the point, and I may even admit I was mistaken ;) Secondly, if the job I want my iMac to do is not that tough, maybe I shouldn't bother at all and buy the i7 with the M290X?

Could anyone submit their GPU temperatures during heavy FCP work and the like?
 
Hey there!

I'm considering buying a 5K iMac - more like in the autumn, really, but still.
This thread has seeded some concerns in me, and because I'm not as fluent in the technical stuff as you guys appear to be, and I want a computer costing this much money to be exactly perfect, I decided to ask some questions. I hope to find some professional iMac users here, not just gamers.

To begin with, apologies if anything I ask has already been covered; I was working through the thread thoroughly but got stuck around page ten.

I don't plan on gaming on my as-yet-hypothetical iMac. What I know I will do for sure is hi-res photo editing and 4K video editing, probably using Final Cut Pro (let's call it FCP). I might also want to do some basic CAD projects with it, but I'm not quite sure about that (it depends on many things connected with my education over the next year). So my question #1 is: why do you gamers say that a professional job like editing 4K video is less demanding on the GPU than gaming? I mean, graphics cards for professionals are far more expensive than graphics cards for gamers (just look at the price of the best Nvidia Quadro on the market today). Additionally, if you look through the FCP website you can see the newest version has been specially optimised to handle TWO high-end professional graphics cards installed inside the Mac Pro. My point is that if FCP didn't demand high graphics performance, nobody would be that fussed about all the optimisation.

Now there are two possibilities: (A) you agree with the above, or (B) you don't.
If (A): Has anybody done research to find out whether throttling is an issue on Yosemite as well, or is it Windows-only territory?
If (B): First of all, why? Not because I want to have an argument; I want to see the point, and I may even admit I was mistaken ;) Secondly, if the job I want my iMac to do is not that tough, maybe I shouldn't bother at all and buy the i7 with the M290X?

Could anyone submit their GPU temperatures during heavy FCP work and the like?

I don't know much about video editing, but as I understand it those professional cards have optimised drivers and features that don't exist on consumer cards, which benefit those pro apps. It also helps that pro apps are much better optimised than games, i.e. FCP would efficiently and effectively utilise an i7 or Xeon and whatever OpenGL GPU (or GPUs!) you throw at it. As for gaming, these days games often don't perform well even at release (in some cases ever), and the only way around that is to throw more computing power at them, and sometimes they still run awfully. This is because games are built for gaming consoles most of the time; they're optimised only for Xbox and PlayStation hardware configs, then ported (probably shoddily) to PC and then to Macs.

Now on to photos. The first time I ever experienced anything close to this was when I first got an rMBP; seeing almost half the pixels of my RAW files at once was awesome. Now we have 15 megapixels to play with, about 3/4 of the pixels of my RAW files. Compared to video editing, working with photos is not particularly demanding, but having significantly more content on your display is even better.
 
i.e. FCP would efficiently and effectively utilise an i7 or Xeon and whatever OpenGL GPU (or GPUs!) you throw at it.
First of all, thanks for answering. You're right, but the GPU will still run at full load (as I found out reading page 12 ;) ); it'll just be more efficient, meaning the full load will produce more, or indeed the same result quicker. So the question of whether throttling is an issue on Yosemite is still not answered, and I'm not sure it can be, due to the lack of measuring software.
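On the "lack of measuring software" point: even without sensor access you can look for throttling indirectly by timing the same fixed chunk of work over and over; if the machine slows down as it heats up, the runs get progressively longer. A minimal sketch (pure Python, so it only exercises one CPU core; the same idea would apply to the GPU if you substituted a fixed OpenGL/Metal render loop):

```python
# Sensor-free throttling probe: hammer the CPU with identical work and
# watch whether each run takes longer as heat builds up. Single-threaded,
# so it loads one core; run one copy per core for a full-load test.
import time

def fixed_workload():
    s = 0
    for i in range(5000000):   # arbitrary deterministic busywork
        s += i * i
    return s

start = time.perf_counter()
baseline = None
while time.perf_counter() - start < 600:   # sustain load for 10 minutes
    t0 = time.perf_counter()
    fixed_workload()
    elapsed = time.perf_counter() - t0
    baseline = baseline or elapsed
    print(f"{elapsed:.3f}s ({elapsed / baseline:.2f}x first run)")
```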

But I think I can present a new point of view on the subject. I agree that there is an issue with too much thermal paste. When I had a desktop PC a few years ago, or rather more than one that I maintained, I discovered that the difference is... considerable. But in my view there is a more fundamental design mistake.


Image from ifixit.com.

Above you can see an iMac with Retina display... taken apart. You'd have to be a lunatic to place a fan this way! The air comes in through the very thin bottom edge of the iMac, has to literally squeeze under the mainboard, and then gets sucked into the fan at an angle of pi/2 (or indeed 90 degrees) through a gap that is... I don't know how much thinner than the inside of the computer, but not by much. Then the little bit of air that accomplished the 'mission impossible' of getting into the fan has to travel DOWN as it gets warmer, which is not quite the direction the laws of physics suggest. All of the above results in a very poor volume of air passing through the cooling system per unit of time. The GPU cannot cool itself, because it never meets the air it should give its energy to. Why did Apple expect it to work properly at all?

And my diagram of the airflow is below. Seriously, if I were air I would have committed seppuku three times before getting past all the obstacles in the iMac.

[airflow diagram image]
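To put a number on "very poor air volume": the heat an airstream carries away is Q = m_dot * c_p * delta_T, so you can estimate the flow needed to remove heat. A sketch, assuming ~200 W of combined CPU + GPU heat and a 20 K intake-to-exhaust temperature rise (both figures are my assumptions):

```python
# How much airflow does it take to carry away ~200 W of heat?
# Q = m_dot * c_p * delta_T, with standard properties of air.
rho = 1.2        # kg/m^3, air density at ~20 C
c_p = 1005.0     # J/(kg K), specific heat of air
power = 200.0    # W, assumed combined CPU + GPU heat at full load
delta_t = 20.0   # K, assumed intake-to-exhaust temperature rise

m_dot = power / (c_p * delta_t)   # required mass flow, kg/s
volume_flow = m_dot / rho         # m^3/s
print(f"{volume_flow * 1000:.1f} L/s = {volume_flow * 2118.88:.0f} CFM")
# -> ~8.3 L/s (~18 CFM)
```

~18 CFM is modest in free air, but that's exactly the catch: a convoluted intake path like the one sketched above adds static pressure, and a blower's delivered flow drops well below its free-air rating as pressure rises.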
 
First of all, thanks for answering. You're right, but the GPU will still run at full load (as I found out reading page 12 ;) ); it'll just be more efficient, meaning the full load will produce more, or indeed the same result quicker. So the question of whether throttling is an issue on Yosemite is still not answered, and I'm not sure it can be, due to the lack of measuring software.

But I think I can present a new point of view on the subject. I agree that there is an issue with too much thermal paste. When I had a desktop PC a few years ago, or rather more than one that I maintained, I discovered that the difference is... considerable. But in my view there is a more fundamental design mistake.

Image from ifixit.com.

Above you can see an iMac with Retina display... taken apart. You'd have to be a lunatic to place a fan this way! The air comes in through the very thin bottom edge of the iMac, has to literally squeeze under the mainboard, and then gets sucked into the fan at an angle of pi/2 (or indeed 90 degrees) through a gap that is... I don't know how much thinner than the inside of the computer, but not by much. Then the little bit of air that accomplished the 'mission impossible' of getting into the fan has to travel DOWN as it gets warmer, which is not quite the direction the laws of physics suggest. All of the above results in a very poor volume of air passing through the cooling system per unit of time. The GPU cannot cool itself, because it never meets the air it should give its energy to. Why did Apple expect it to work properly at all?

And my diagram of the airflow is below. Seriously, if I were air I would have committed seppuku three times before getting past all the obstacles in the iMac.

The thing is, the previous GPUs in this iMac design actually ran cool and quiet. This thermal design wasn't ready for AMD hardware, which typically runs noticeably hotter than recent Nvidia GPUs. Unlike the nMP, an engineering marvel designed from the get-go to be a jet cylinder of cool. The new i7 4790K CPU runs hotter than the previous i7 4770K. The new R9 M295X GPU runs hotter than the GTX 780M. While the cooling system does OK with a normal workload, under heavy workload the fans kick in sooner. Why they decided to stick with the same thermal design is beyond me... They're probably waiting for the iMac redesign.
 
The thing is, the previous GPUs in this iMac design actually ran cool and quiet. This thermal design wasn't ready for AMD hardware, which typically runs noticeably hotter than recent Nvidia GPUs.
That means the thermal design simply is not good enough, because the same AMD GPU used in desktop PC cards doesn't throttle, as I read in a link found in this thread.

They're probably waiting for the iMac redesign.
And so should we, I'm afraid.

@below: In short: the cooling system is not good enough. Exactly the point I made. ;)
 
I have been following this thread for weeks now and am still hesitating whether or not to buy the current late-2014 iMac 5K (I especially need it as a Windows gaming machine via Boot Camp as well, and the 105-107 C temps with the tremendous throttling issues are just a major turnoff with the new iMac - and make me want to wait for the possible next 2015 model).

Debating whether Apple should have chosen the 980M over the M295X is just speculation - and the reasons for the selection have been stated numerous times. AMD is the GPU provider in Apple products for now, and that decision possibly won't change with the 2015 model either.

But the real issue here is that Apple should have foreseen the thermal output of AMD's M295X and redesigned the iMac's cooling system to properly handle the heat output of that chip (whose thermal output may lie somewhere between 120 and 150W).

It is normal for mid-range desktop GPU cards (comparable to high-end mobile GPUs) with a 120-150W TDP to have 6-8 heatpipes and huge heatsinks with 2-3 fans to carry away the heat.

For example, here you can see the Asus version of the 285X desktop GPU, which should be closely comparable to the M295X:
http://www.pcper.com/news/Graphics-Cards/AMD-Announces-Radeon-R9-285X-and-R9-285-Graphics-Card

These cards have huge, thick heatsinks that may be as long as the card (nearly 30cm), and they can take the heat and dissipate it, keeping the GPU at ca. 80C under full load for hours.

What Apple did here was just fit the i7, putting out 95-105W, plus the M295X, putting out something like 120-130W (totalling well over 200W TDP), onto a single small heatsink that uses a single (1) heatpipe from the CPU and a single (1) heatpipe from the GPU. See it for yourself in the iFixit teardown pictures:

https://www.ifixit.com/Guide/iMac+Intel+27-Inch+Retina+5K+Display+Heat+Sink+Replacement/30523

(Just scroll down to steps 59 and 60 at the bottom of the page.)

What the "F" is that??! There is just a tiny heatsink that tries to dissipate the heat generated from CPU and GPU. A single puny fan whirling 2800 rpm and sucking hot air from inside of the iMac, already cramped air space.

This is not something that can be fixed by replacing the thermal paste with a proper silver compound (or, as even crazier ideas suggest, with some magical driver update). This is also not a matter of Windows or OS X. This is a matter of hardware. It occurs on OS X too if the GPU is pushed to full load.

It is just madness that Apple did not redesign the whole heatsink system for the needs of the new CPU and GPU. There should have been at least 3 proper heatpipes from the CPU and 3 proper heatpipes from the GPU, and a bigger heatsink and fan, to cool these components properly.

When you look at that cooling system there is absolutely no question about whether it is sufficient for the chosen components (i7 + M295X). It is not. And it's absolutely no wonder that the GPU heat ramps up to 105 C within a couple of tens of seconds of a full-load test. A single heatpipe is just not enough to move the heat to the heatsink, which is itself undersized.
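One way to quantify this is the implied chip-to-air thermal resistance, R = (T_chip - T_ambient) / P. A sketch using this post's own figures (105 C observed in the iMac, ~80 C for the desktop card above, the 120-150W TDP range; the 25 C room temperature is an assumption):

```python
# Implied thermal resistance of the iMac's GPU cooling vs. a desktop card,
# using this post's own numbers. Ambient temperature is assumed.
t_ambient = 25.0                          # deg C, assumed room temperature
t_imac_gpu, t_desktop_gpu = 105.0, 80.0   # deg C, full load, per the post

for power in (120.0, 150.0):              # W, the post's TDP range
    r_imac = (t_imac_gpu - t_ambient) / power
    r_desktop = (t_desktop_gpu - t_ambient) / power
    print(f"at {power:.0f} W: iMac ~{r_imac:.2f} K/W, "
          f"desktop cooler ~{r_desktop:.2f} K/W")
# at 120 W: iMac ~0.67 K/W, desktop cooler ~0.46 K/W
# at 150 W: iMac ~0.53 K/W, desktop cooler ~0.37 K/W
```

The extra heatpipes and the 30cm heatsink are precisely what buy the desktop card its ~25 K lower die temperature at the same power.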

This is beyond repair unless Apple recalls all the first-gen iMac 5Ks and replaces the heatsinks with redesigned models (if that's even possible given the thinness of the iMac). Somehow I don't believe that will happen.

Oh Apple, why did you fail us (again)?
 
Thought I'd add my experience to this thread. Please excuse my lack of technical know-how.

I've been doing a little gaming in Boot Camp on my 5K iMac (i7, M295X) over the past few days. I've come from a 1GB Radeon HD 5750 card in a 2010 iMac, so this was a big upgrade for me.

My M295X got to 108C, or 226F, this evening, whilst the processor cores hit 98C, or 208F. I ran the game (America's Army) for about 2 hours at 2560x1440 with high settings. It never dropped below 45 FPS that I saw, and was generally sitting at 65-85. That said, where a scene ran at 75-85 FPS in the first few minutes, the same scene saw 55-65 FPS after about 20 minutes. I can't tell for sure that this was due to throttling, as it's a very crude test, but it is suggestive.
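If you want something less crude than eyeballing frame rates, sensor loggers on the Windows side (GPU-Z, for instance, has a log-to-file option) can dump temperature and core clock to a CSV while you play, and a few lines of Python will show whether the clock sags once the card is hot. The column names below are placeholders; match them to whatever your log's header row actually says:

```python
# Sketch: check a GPU sensor log (CSV) for thermal throttling by comparing
# average core clock when hot vs. cool. Column names are examples only --
# adjust them to your logger's actual header row.
import csv

temps, clocks = [], []
with open("gpu_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        temps.append(float(row["GPU Temperature [C]"]))
        clocks.append(float(row["GPU Core Clock [MHz]"]))

hot = [c for c, t in zip(clocks, temps) if t >= 100]
cool = [c for c, t in zip(clocks, temps) if t < 90]
if hot and cool:
    print(f"avg clock below 90 C: {sum(cool) / len(cool):.0f} MHz")
    print(f"avg clock at 100 C+:  {sum(hot) / len(hot):.0f} MHz")
    # A clearly lower average when hot points to throttling rather than
    # ordinary scene-to-scene frame-rate variance.
```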

What I did find fairly impressive, though, was how damn quickly the thing cools components down to about 50-60C: from 100C, you get to 60C within around 10-15 seconds of quitting the game. Also, my 2010 iMac used to be extremely hot to the touch on the back aluminium after gaming. My 5K stays cold - actually cold - besides the small rear vent, which is slightly warm.

It's a huge shame, and I'm extremely tempted to return this iMac. It's just failed to deliver the usual 'delight' that other Apple products do so well. The screen, however, is just stunning.
 
I still don't get it: what exactly is the problem? That the GPU reaches 108C under heavy load?
So what?

As long as that doesn't affect the iMac's performance - and it doesn't, as I can play Warcraft for hours at 3K resolution on ultra settings at a steady 60 FPS (and it's still playable at 4K) - forget about the temperature. Even at 108C it is not as loud as a gaming PC with huge heatsinks and six fans and everything.

On the other side, if we're talking about work (not games), I'm still waiting for somebody to complain about the real-world impact of the hot GPU on the iMac's performance. Just uninstall iStat Menus and use your iMac instead of obsessing over the GPU temps.
 
I still don't get it: what exactly is the problem? That the GPU reaches 108C under heavy load?
So what?

As long as that doesn't affect the iMac's performance - and it doesn't, as I can play Warcraft for hours at 3K resolution on ultra settings at a steady 60 FPS (and it's still playable at 4K) - forget about the temperature. Even at 108C it is not as loud as a gaming PC with huge heatsinks and six fans and everything.

On the other side, if we're talking about work (not games), I'm still waiting for somebody to complain about the real-world impact of the hot GPU on the iMac's performance. Just uninstall iStat Menus and use your iMac instead of obsessing over the GPU temps.

Then again, most top-end R9 GPUs seem to operate at 100+ degrees anyway, and most of them are still going strong.
 
Also, my 2010 iMac used to be extremely hot to the touch on the back aluminium after gaming. My 5K stays cold - actually cold - besides the small rear vent, which is slightly warm.

In fact, this is bad. The metal case of the iMac is actually part of the cooling system; you can treat it as a giant heatsink. When the computer works hard, it should be hot (or at least warm). Otherwise, it means the heat is trapped inside the case (not an effective cooling system).
 