No there absolutely is not. My fans were running at 2300rpm. You can actually hear them running, unless I muted the video (can't remember).

There is ABSOLUTELY no way the fan on Windows is stuck at 1200rpm. The flipping thing would shut down in a matter of minutes if that were the case.

Anyone with a 5K iMac can prove that the fan spins up. A chunk of this entire thread has been about the fan speed and heat on WINDOWS. If your fans aren't spinning up on Windows, it's YOU with the faulty iMac.

I look forward to your video.

Edit: just noticed your previous post actually highlights the fact that your fan is above 1200rpm. I don't know what you're talking about anymore. This has gotten too far-fetched.
If the fan were stuck at 1200, within a matter of minutes the GPU would start throttling at around 106-108°C, but it wouldn't shut down like yours.

But why 2300? The maximum is 2700!

I think you are a bit confused... :confused:

And give me a couple of hours and I'll post the video
 
Thanks William.

Astelith, I honestly don't know what you're talking about any more. Just watch my video. You can literally hear the fan spin up and down with the game.

William also commented and posted a screenshot.

The maximum fan speed that ALL of the screenshots so far have shown, with the exception of yours funnily enough, has been 2300rpm.
 
Are there any problems when working with layer-heavy Photoshop compositions, or is this a problem only with games? How about in Logic Pro X?

Because if this is limited to games, I may be tempted to replace my 2010 i3 iMac sooner than expected.
 
A game is simply a GPU-intensive application. Any similarly intensive process or app will cause the same effects.
 
Games tend to tax the GPU more. While I don't game on my Mac, I do game heavily on my PC. The GTX 660 Ti in my Dell is near silent in Photoshop, 3ds Max, and Blender, but if I launch WoW and play for a while, you can hear it across the room.
 
On my late 2013 iMac, when I'm playing 3D games on Windows 8.1, the fans rise to 2300 or 2700 rpm; I can hear them.
 
There are a lot of uneducated people posting on this thread, but I won't call anybody out by name.

Regarding the temperature of the M295X: why do you think it's running too hot at 105°C?

1. You stuck your finger in the air, selected a random number and unilaterally decided it's too high.

2. It sounds hot because water boils at around that temp.

3. You are stuck in the past when these temps would have been too high.

You're probably thinking the same thing about me: 'what makes you think they're not hot?'. I took the time to learn about it. I'll make it easy for you, since you obviously can't be bothered to make the same effort; here is a link to a well-researched article that I implore you to read: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/5

I will definitely trust a respected resource like this more than I will trust some stranger spouting off in a message forum without backing up their statements.

AMD's current chips are designed to run hot; they even use this temperature differential to take advantage of Newton's Law of Cooling. The paradigm has changed, so make sure you're keeping up with developments.
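
For the formula-minded, here is the rough idea (idealized; the ~35°C enclosure ambient below is my own assumption, not an AMD figure):

\[ \frac{dQ}{dt} = hA\,\left(T_{\text{die}} - T_{\text{ambient}}\right), \qquad \frac{105 - 35}{85 - 35} = 1.4 \]

With the same airflow, a die held at 105°C sheds roughly 1.4 times the heat per unit time of one at 85°C; that extra transfer is the headroom the design is exploiting.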

Having said all that, I am not blind to opposing arguments. I do not know whether running at these temps is healthy in the long term. But you can't answer that either; only AMD will know those figures. But they're obviously comfortable enough with the failure rates to push forward on this path.

I also note that the article quotes 95°C as an upper limit, but consider that the article is 15 months old (a long time in silicon development) and the M295X is a newer architecture, which could easily explain the difference.
 
I'd just like to add this:

I have a high-end 5K iMac, i7/M295X, etc. I run Windows 8.1 for some games, and I just installed Macs Fan Control to test your theory of the fan not speeding up past 1200rpm...

And my findings? Nonsense. I just fired up some Diablo 3, and the fans eventually found their way to 2300rpm, which APPEARS to be Apple's cap for max fan speed. Perhaps the fan could hit its true maximum (2700rpm), but that looks like a real worst-case scenario, and it has never happened for me under OS X or Windows 8.1. I'd recognize that sort of fan noise, as it's quite loud (when manually set to 2700rpm).

Here's a screenshot, as well, as proof.

There's NO issue with gaming in Windows 8.1 on a 5K iMac.

FYI, it was taking a long time for the fan to hit 2300rpm at 2560x1440 in Diablo 3 because the GPU wasn't being hammered, but when set to 3840x2160 (the max res Windows 8.1 currently supports) the fan got to 2300rpm in under 10 seconds, so it was a simple test.
I'm talking about Windows 10; the Boot Camp drivers apparently are not yet updated for this version.

The fan at 2300 is not an "Apple lock" but the speed needed to keep the GPU in thermal spec, and with your screenshot you just proved that you have a functional iMac, with temp and fan speed at normal values...

----------

I've put a 30-minute video on Vimeo, and I can post it
 
The fan sensor in GPU-Z is not working with the 5K, and in Windows there is an issue preventing the fan from spinning over 1200rpm without fan control software


I'm talking about Windows 10; the Boot Camp drivers apparently are not yet updated for this version.

All I saw was the above quote where you just said "Windows." If you had said Windows 10 in that post, it would have made life a lot easier.

Honestly, and maybe I speak for myself - but Windows 10 isn't even REMOTELY ready for public consumption at this point, so... honestly... who cares what RPM the fans run? Apple does not support Windows 10, so I'm not sure why this thread is even dwelling on fan speeds in that OS.
 
Windows was used as one of the tests; here is the video in Yosemite:

https://vimeo.com/118953434

What can I prove with that?

- The system is designed to run the GPU at 99°C, not more and not less
- With the fan set at maximum (2700), the cooling system can easily keep the GPU below 90°C under 100% load (yes, in the video it was 93, but if you wait it keeps going down)
- The fan starts to spin up at 99°C in order to hold that temp, on purpose; the loop behaves roughly like the sketch below
- My turtle is cool :cool::D

But obviously even this video is a fake ;)
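
Here is a minimal sketch of that kind of closed-loop fan control, just to make the behaviour concrete. It is only an illustration: read_gpu_temp() and set_fan_rpm() are hypothetical helpers, not a real Apple/SMC API, and the gain is made up.

import time

TARGET_C = 99                      # temp the controller tries to hold
MIN_RPM, MAX_RPM = 1200, 2700      # idle and maximum fan speeds
GAIN = 150                         # extra rpm per degree of overshoot (illustrative)

def fan_control_loop(read_gpu_temp, set_fan_rpm):
    # Proportional control: idle at MIN_RPM below the target, then ramp
    # the fan harder the further the GPU diode overshoots the target.
    while True:
        error = read_gpu_temp() - TARGET_C   # positive = too hot
        rpm = MIN_RPM + GAIN * max(error, 0.0)
        set_fan_rpm(int(min(rpm, MAX_RPM)))
        time.sleep(1)                        # poll once per second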
 
I'm really not sure what your video proves. That a windowed World of Warcraft, not even a "focused" WoW, keeps the temperatures down? I see the exact same thing with Diablo 3, by the way. My GPU peaks at an occasional 106°C at 2300rpm (usually 105°C) when a game is actually full-screen and properly running.
 
Screenshot, check.
Video, check.

Now the next step is an on-site visit here... Be my guest lol
 
No, I don't dispute what you're seeing. I see the same thing on my system. What I'm saying is that running your system in a window like that does not tax your computer the same way full-screen, full-res applications do.
 
Full screen or windowed is the same, and I already proved that weeks ago
 
Here's Borderlands 2 running at max spec: http://i.imgur.com/OV4JxDm.jpg?1


My issue is that the late 2012 iMac could run this at the same spec and stay quiet for the most part - what's causing the GPU in this case to think it needs to push it so hard/hot/loud (gigedy)?

Also, how 'bad' is the M290X? Even though it's 'worse', would it be able to run a game like that at max spec, the way my (maxed) late 2012 iMac could? They both have 2GB of graphics RAM; the 2012 one is the NVIDIA GTX 680MX. So, assuming THAT one was fine for me, would the much newer M290X be better, or the same?
I assume the only reason it would have difficulty is that it also has to run the 5K screen; how much extra pressure does that put on it? Please don't start talking to me in gigawatts and light-years, I just want to know if it will run at least the same as the old one without slowing to a standstill...
As much as I'd love the M295X, I can't handle that over-the-top fan speed. It may be necessary, but it's distracting.

It's all very well for people to say 'it's fine to run that hot', but why would Apple allow a GPU to run the fan distractingly loud compared to previous models? The boost in performance surely isn't worth it...
 
Mind linking to that? I skimmed but probably overlooked it.

Google "fullscreen vs windowed fullscreen FPS". This has been an active topic that varies from system to system and game to game since the dawn of computer gaming.

ESO, for example: I average ~20 FPS better (and obviously a cooler GPU) in windowed mode vs fullscreen. A very significant difference, to the point that I only ever play in windowed mode.

Rarely have I seen a game that didn't vary one way or the other. Even minor variations in resolution (top and bottom not gaming) typically have an effect of some sort.
 
Another question regarding fan speed: I just did something a tad 'dangerous' and used Macs Fan Control to keep the fan at 1200 whilst running the game. The GPU got to 108°C but didn't get worse than that. Presumably throttling of some kind kicks in... but the game seemed to carry on running fine.
Anyway, the question is, maybe I could get away with reducing the max fan speed by 300 or so; that would make it acceptably quieter and probably wouldn't make much difference to the GPU diode temp from what I can tell. How much worse can 108 degrees be than 105 degrees?
 
Thanks for bringing something serious to discuss here. Honestly, I find it strange, because I'm quite sure there's something wrong with the fan speed or the sensor. Considering that you can hear the fan and call it loud (and at 2300 we cannot say "quiet"), it suggests to me that your sensor is reporting 5-6°C more than the real temp; that is the only rational explanation I can see, especially because above 107°C you should notice a big framerate drop.




----------

Sure, let me find it, or failing that I'll make another test from scratch

----------

And this is strange: 108°C was the maximum temp I got with the fan at 1200, and at that temp the GPU cut the voltage and the clock hard to stay cool.
But why do we see the same max temp but not the same target temp? And why is your fan not going past 2300 to keep the GPU at 100°C?
This is the real question, and you should do an SMC reset (steps below) and, if it doesn't get better, call Apple for a check.

I don't recommend forcing the fan to run below the speed the system originally assigns; maybe it's not dangerous, but it's certainly not optimal
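
(For reference, Apple's usual SMC reset procedure on a desktop Mac, as I understand it: shut down, unplug the power cord, wait about 15 seconds, plug it back in, wait a few seconds, then power on.)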
 
Regarding the FPS, whether it's an improvement or a decrease is, I think, application related; but when the GPU is at 100%, the temp is the same fullscreen or windowed. Even running FurMark in a small window can push it to the limit.
In a healthy iMac the system should not go above 100/101°C; at worst the fan just runs faster.
Tomorrow I'll link you some comparisons.
 
I wasn't assuming the temps would be static in both windowed and fullscreen. Prior-year models could run the GPU at 100% load with virtually no fluctuation in fan speed; basically, the cooling capacity far exceeded the GPU's capacity for heat generation. So running a game on ultra-high settings, or until there is a noticeable drop in frame rate, might be needed to get the GPU to exceed 90°C at 1200 RPM.

When gaming I'll set the fan to 1700 RPM (about as fast as I can go and still not hear it) to maintain a 75-85°C max GPU temp.

Anyway, I guess the question would be: what fan RPM do you see in windowed vs fullscreen? And what is the difference in FPS and temp, if any at all?
 
You need to read the article again, slowly.

Here is a great summation:

"Ultimately there is always going to be a longevity cost to increasing temperatures..."

He then goes on to say that warranty and longevity issues are AMD's worry.

But in this case they are not. The GPU is soldered right there on the logic board. And even if by some miracle of modern engineering AMD has rolled back the laws of physics on their GPUs, I doubt very much that applies to all of the other electronics on the logic board.

So, we have a line of computers that has historically had heat problems, and we now add in a GPU running hotter than... well, 10°C hotter than an R9 290X, apparently. What do we expect to happen?

There is no amount of spin you can put on this that makes internal temps over 220°F sensible in a computer. There isn't.

Basically, AMD is losing both the CPU and the GPU race. The only way they can make their tired GPUs even moderately competitive is to crank them to insane speeds. Apple was desperate to make performance somewhat usable, so they let AMD run the thing at "11". Anyone who doesn't buy AppleCare on this machine should go ahead and start looking for a new logic board.

BTW, I have a buddy who is unloading his 2010 MacBook Pro RIGHT NOW. Why? Because its AMD GPU has boiled itself to death. Did he ever run a game? Nope. Just web browsing, YouTube, standard usage. He never even had Windows on it, and we know there aren't many Mac games that push the GPU.

I have been in the GPU biz for 10+ years. 105°C is WAY TOO HOT.

If you buy one of these machines, get AppleCare and sell it at 2 years.

Oh, and that stuff about Newton's Law of Cooling, let me sum it up for you. You're heading up the Tejon Pass; at 5,000 ft you pass an Escalade boiling over, its radiator water at 260°F. Meanwhile in your Subaru, the radiator is only at 200°F. The Newton's Law of Cooling being referenced says that the Escalade can give off more heat on a 100°F day since it is hotter.

Or, more simply put: a frying pan on the stove heated to red hot can give off more heat than one that is less hot. Even when there is no heat coming from the burner, the hotter pan gives off more heat and thus cools faster. So, do you really want your $3,000 iMac to be like the Escalade or the red-hot pan, and then brag about how easily it gives off heat? That's not really a good thing.
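
To put rough numbers on that analogy, using the same law and the figures quoted above:

\[ \frac{260 - 100}{200 - 100} = 1.6 \]

so the boiling-over Escalade is dumping about 1.6 times the heat per unit time that the Subaru is. Shedding heat faster only because you're running hotter is nothing to brag about.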
 
It makes sense, but if you look at the temps in my video, the temps inside the iMac are in good ranges; even the GPU proximity sensor is low.
This makes me think something unusual is going on with the GPU diode sensor. Overall this machine runs cool: if you touch the chassis after hours of gaming it's cold, only slightly warm at the base mount. My old 2011 iMac under the same conditions gets a lot hotter, like you can't keep your hand on the back.
Apple can't afford to have the first Retina iMac burning out after one or two years; I see this scenario as highly improbable. Recalling units for a design flaw has an enormous cost in money and corporate image.
 
It makes sense, but if you look at the temps in my video, the temps inside the iMac are in good ranges; even the GPU proximity sensor is low.
This makes me think something unusual is going on with the GPU diode sensor. Overall this machine runs cool: if you touch the chassis after hours of gaming it's cold, only slightly warm at the base mount. My old 2011 iMac under the same conditions gets a lot hotter, like you can't keep your hand on the back.

Agreed: the rest of the machine does stay relatively cool; even the other GPU sensors stay cool. So what's going on with the diode? Is anyone else able to check? Are we CERTAIN the 100+ degrees is accurate? Is there some way of physically checking? It also seems odd for it to cool down so quickly... air doesn't usually conduct heat like that.

Apple can't afford to have the first Retina iMac burning out after one or two years; I see this scenario as highly improbable. Recalling units for a design flaw has an enormous cost in money and corporate image.

I don't agree with this; it's a hugely misplaced statement of faith. Apple absolutely CAN afford to have the first Retina iMac burning out after one or two years. They've had many similar huge issues that have required recalls, warranty extensions, etc. You know what else they have? Some of the best PR and media agents available. They also have a massive base of die-hard fans who are happy to have a five-minute grumble and then put Apple back onto the pedestal of 'can do no wrong'. A story could probably come out tomorrow where every time Apple sold a Retina iMac a puppy was murdered to help make the CPU, and people would still buy anything and everything Apple sells.

Also, is anyone able to help with my previous question about the M290X vs the M295X: will the lesser model still be fine for gaming with an i7 CPU? As mentioned, my top-spec 2012 iMac can handle my games at max spec, so I assume the newer M290X will still be able to handle that, even if it's not as good as the M295X?
 