Hi,

I monitored the temps of my system, and observed the following:

* CPU maxes out around 89 deg. C.
* GPU hits 101 deg. C after about 10 minutes, then the fan ramps up and it drops to around 97 deg. C where it seems to sit consistently thereafter.

During the drop in GPU temp, no observable loss of performance was noted.
 
OK, let's stop all the obsessive temp monitoring. The iMac has 4x more pixels and uses the 295 Tonga chipset, the first of its kind. It's powerful, and yes, the fans do ramp up when playing games. So what?

But if that's actually stopping you from buying it, then so be it. The 5K screen is to die for, and using it for photo editing is fantastic.

Being a heavy gamer, I can tell you Macs are not good for games, end of. The Mac OS just isn't up to it like a PC is; Boot Camp it and you'll see the FPS jump up.

I've tried all sorts of things:

World of Tanks
Warcraft
StarCraft 2

I also tested a fully loaded MacBook Pro i7 Retina, and guess what: it overheated and shut down when gaming, and it throttled a hell of a lot in WoW to cope with the heat. So this throttling lark has been with us a while.

If you want the crisp screen in one neat all-in-one package that can pump out around 60 to 75 FPS in most games at 1920x1080, then this is the one.

But don't think you're going to game in 5K: most games will run, but at sub-20 FPS.

For most recent AAA games I recommend 1440p on medium-high, with shadows and other demanding settings on low, and textures pushed to Ultra since you've got 4 GB of VRAM. In well-optimised games you can definitely run at an average of 50 FPS with a 30 FPS minimum. Older games run great in 4K and look spectacular.
 
Again, you might not 'see' throttling because of what you're doing. Also, without actually monitoring, you're likely going to struggle to notice it.
It's nearly impossible to measure, since current CPUs always throttle no matter what. You don't know if it throttles because of heat or because the machine simply doesn't need that much speed at that moment. CPU usage will always have spikes, even when you let the machine idle. Modern GPUs are similar to CPUs when it comes to throttling: you start out at the bottom and performance ramps up when it needs to. Simply put, performance is delivered on an as-needed basis. If you use a power meter you can see this behaviour too when you do CPU- or GPU-intensive tasks.

It exists, though. Watch this from about 2 minutes in...

http://youtu.be/tgTMxB-ffjM

A similar issue applies with the M295X GPU. Simply read this thread for all the proof points you need.
Did that and ran some other stuff (4K videos) next to it. Temps for both CPU and GPU didn't go beyond 83C nor did the fans kick in. Never noticed any loss of performance.

Things are different when you try to watch 4K @ 60fps. I tried with the current Opera version, but this doesn't work all that well: Opera clearly struggles with 4K @ 60fps, stressing the CPU (nearly 300% usage), and stutters like there's no tomorrow. Drop it back to 4K @ 30fps and all is well. With gaming you'll probably see something similar.
 
AMD Radeon R9 M295X Core Clock Throttling, Heat, and Performance

It's nearly impossible to measure, since current CPUs always throttle no matter what. You don't know if it throttles because of heat or because the machine simply doesn't need that much speed at that moment. CPU usage will always have spikes, even when you let the machine idle. Modern GPUs are similar to CPUs when it comes to throttling: you start out at the bottom and performance ramps up when it needs to. Simply put, performance is delivered on an as-needed basis. If you use a power meter you can see this behaviour too when you do CPU- or GPU-intensive tasks.


You absolutely CAN measure thermal throttling.

Push CPU to max using a benchmark.
Wait.
If the clock speed suddenly drops, record the temperature X.
Rinse and repeat.

If X is always the same, your CPU is thermal throttling.

In the case of the i7, it throttles at 100C and the M295X at around 105-106C.
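
If you want to automate that loop rather than eyeballing it, here is a rough Python sketch of the same procedure. The two reader functions are placeholders I'm assuming you'll fill in yourself (they are not real APIs): hook them up to whatever monitoring tool you already use, and run a CPU benchmark in parallel so the chip stays pinned at full load.

import time

def read_freq_mhz():
    # placeholder: return the current CPU frequency in MHz from your monitoring tool
    raise NotImplementedError

def read_temp_c():
    # placeholder: return the current CPU temperature in deg C from your monitoring tool
    raise NotImplementedError

def find_throttle_temps(samples=600, interval_s=1.0, drop_mhz=200):
    """Record the temperature every time the clock speed suddenly drops."""
    throttle_temps = []
    prev_freq = read_freq_mhz()
    for _ in range(samples):
        time.sleep(interval_s)
        freq, temp = read_freq_mhz(), read_temp_c()
        if prev_freq - freq >= drop_mhz:   # a sudden drop in clock speed...
            throttle_temps.append(temp)    # ...so note the temperature X at that moment
        prev_freq = freq
    return throttle_temps  # if these cluster around one value, that value is X

if __name__ == "__main__":
    print(find_throttle_temps())

If the list comes back as something like [99, 100, 100, 99, ...] on the i7, that matches the 100C figure above.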
 
You absolutely CAN measure thermal throttling.

Push CPU to max using a benchmark.
Wait.
If the clock speed suddenly drops, record the temperature X.
Rinse and repeat.

If X is always the same, your CPU is thermal throttling.

In the case of the i7, it throttles at 100C and the M295X at around 105-106C.

But all i7s will throttle at 100C; that seems to be the designed limit, from the quick reading I've done (I can't find an official Intel comment on it).

Now, the fact that the iMac gets to 100C too quickly is a different discussion. It's either a case of not enough cooling (the quest to keep things as slim as possible) or of using the wrong CPU. And while we can all agree the GPU was essentially that or nothing (as no NVIDIA option was available), there are plenty of Intel CPUs that could have been used. So why this one?

Ultimately, someone at Apple signed off on this design, rightly or wrongly. Time will tell.

I read somewhere that Apple machines tend to work best if your use cases fit the way Apple intended the machine to be used. Is that a fair statement to make? Are people expecting too much? For my use cases I've yet to see any CPU issues at all, so for me it's a non-issue. While I do play games, I haven't bought anything new for about four years (apart from Humble Bundle deals and the like), and at worst I've seen GPU temperature spikes that seem to happen for no real reason and go away again just as quickly.
 
You absolutely CAN measure thermal throttling.

Push CPU to max using a benchmark.
Wait.
If the clock speed suddenly drops, record the temperature X.
Rinse and repeat.

If X is always the same, your CPU is thermal throttling.

In the case of the i7, it throttles at 100C and the M295X at around 105-106C.

After this stuff:

http://hardforum.com/showthread.php?t=1832669&page=2

it looks like Intel CPUs have a maximum Tj of 105 Celsius. You can read this out via the PROCHOT register. Apparently the thermal throttling threshold is one degree Celsius below Tjmax.

However, maybe Apple lowered Tjmax and this threshold by a few degrees Celsius to be more on the safe side.
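
If you'd rather check the threshold yourself than trust forum numbers, one way is to read it straight from the CPU. Here's a minimal sketch, assuming a Linux boot with the msr kernel module loaded and root privileges (OS X doesn't expose MSRs this way): it reads the TCC activation temperature, i.e. Tjmax, from MSR_TEMPERATURE_TARGET (0x1A2), bits 23:16.

import struct

MSR_TEMPERATURE_TARGET = 0x1A2  # per Intel's documentation

def read_msr(msr, cpu=0):
    # /dev/cpu/N/msr is addressed by seeking to the register number and reading 8 bytes
    with open("/dev/cpu/%d/msr" % cpu, "rb") as f:
        f.seek(msr)
        return struct.unpack("<Q", f.read(8))[0]

def tjmax(cpu=0):
    # bits 23:16 hold the temperature at which thermal throttling kicks in
    return (read_msr(MSR_TEMPERATURE_TARGET, cpu) >> 16) & 0xFF

if __name__ == "__main__":
    print("Tjmax: %d C" % tjmax())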
 
You absolutely CAN measure thermal throttling.

Push CPU to max using a benchmark.
Wait.
If the clock speed suddenly drops, record the temperature X.
Rinse and repeat.

If X is always the same, your CPU is thermal throttling.
That is not measuring. What you measure is the frequency and the temperature over a certain amount of time. What you then do is look for a trend and use reasoning to draw a conclusion. That is more like making an educated guess, which is as close as you can get. Measuring is something entirely different.

The problem is accuracy. You are forgetting that the load, even when trying to run at max, is not always exactly the same. The fact that modern CPUs and GPUs go up and down in frequency all the time doesn't help; they are really eager to stay at the lowest possible frequency. The operating system will also interfere, since it tries to balance things. And then there is the benchmark: it has to generate a fixed amount of load in order to prevent the CPU/GPU from throttling down due to load or OS interference. If you measure the frequency, you can do it every millisecond, every second, or every 10 seconds, and what you choose greatly influences what you get to see. And so on.
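
To see how much the sampling interval matters, here is a toy Python illustration using purely synthetic data (nothing is read from real hardware): a simulated CPU that dips from 4000 MHz to 3200 MHz for 50 ms once every second is "monitored" at three different intervals, and the coarser samplers never see the dips at all.

def make_trace(seconds=30, tick_ms=1):
    # synthetic frequency trace: a 50 ms throttle dip in the middle of every second
    trace = []
    for t in range(seconds * 1000 // tick_ms):
        ms_into_second = (t * tick_ms) % 1000
        trace.append(3200 if 500 <= ms_into_second < 550 else 4000)
    return trace

def sample(trace, interval_ms, tick_ms=1):
    # one instantaneous reading per interval, roughly what a casual monitoring tool shows
    step = interval_ms // tick_ms
    return [trace[i] for i in range(0, len(trace), step)]

if __name__ == "__main__":
    trace = make_trace()
    for interval in (1, 1000, 10000):
        readings = sample(trace, interval)
        print("interval %5d ms -> dip visible: %s"
              % (interval, any(f < 4000 for f in readings)))

With 1 ms sampling the dips show up; at 1 s or 10 s intervals they vanish completely, even though the simulated chip throttles every single second.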

I never said you couldn't detect it, only that it is nearly impossible to do because too many things are at play that you simply have no control over. In the end it really doesn't matter: the machine is either fast enough or too slow, and if it's the latter you're looking at a new machine anyway.

In the case of the i7, it throttles at 100C and the M295X at around 105-106C.
Which is something you can find in the specs of the CPU.
 
AMD Radeon R9 M295X Core Clock Throttling, Heat, and Performance

*semantics*

Watch from 2:15. It's really quite simple. Throttling is not the same as a dynamic clock speed.

http://youtu.be/tgTMxB-ffjM

I have always said it's a great machine for the majority and it is completely subjective whether or not the throttling affects someone. The fact remains, though, that it does exist.

Which is something you can find in the specs of the CPU.


Yes.
 
The semantics here are getting muddied up.

But a lot of that has to do with changes in how Intel and the GPU makers rate their speeds.

In the past you had a max speed, and that was what the CPU was rated at. If it got too hot, it "throttled" down.

GPUs used to have two or three speed states. When needed, they would go to the highest. It was rare that they needed to throttle down, but every ROM I have looked at has a Tmax defined as a point at which to either throttle or just shut off.

Now everything is rated with a "Boost" clock or a "Turbo" mode.

I recently got a base 2014 Mini to work on eGPUs. It is rated at 1.4 GHz, much lower than the 2.6 GHz upgrade option. While in Windows I ran the little Intel CPU app and discovered that it was running at 2.0 GHz. I then left that running while I tested an eGPU in Far Cry 4 for 30 minutes. When I quit the game I looked at the recorded history: the CPU had never gone below 2.0 GHz.

My feeling is that this particular CPU was rated lower to make the 2.6 GHz option look more "worth it" as an upgrade. I have no idea what a 2.6 usually runs at, however.

Same with GPUs: they now have a rated clock and a "Boost" clock that they will go to if needed and if heat allows. So what is the true maximum clock? When is it throttled? Hard to say with these ways of defining things.

On the other hand, 100C for a GPU is a guaranteed short life. No ifs, ands, or buts. I have been dealing with GPUs for more than 10 years: the ones that run hot die early. That simple. The GTX 470 and GTX 480 ran in the 90C range, and after a year or so they started dropping like flies. Same with the 7950/7970 from AMD: ran hot, died young. (Ask my e-waste guy; he's hauled away plenty of both.)

So whatever you call the clock behavior, those 5K iMacs are running too hot. Apple didn't re-write the laws of physics. Their reality distortion field can't do that. And even with a fan blowing heat away, there is radiated heat soaking into the PCB and directly into the back of that 5K panel. I would expect that eventually the area over the GPU will show a color change as the plastics get cooked to a golden brown.

If I had one, I would figure out how to allow that fan to hit 2,700 rpm and turn up the volume on something else to cover the noise.

And I would DEFINITELY buy Applecare.
 
For my use (video and photo editing) temps maxed out at 70, but I used to get temps around 105 running benchmarks. Benchmarks uninstalled, problem solved; I bought AppleCare and now I can sleep well. For my needs it's an awesome machine, but anyone running things that can reach high temps should look for something else.
 
Has anyone tried to downclock the M295X a bit? Could that possibly help with the heat and noise?
 
Can anyone please provide me with the iMac's M295X vBIOS ROM dumped through GPU-Z? Thanks in advance!
 
Anyone seeing high idle temps since 10.10.4 official release? Right now, with casual browsing (no flash), I'm sitting at 64C CPU, and 92C GPU. Seems a bit high, no?! (5K iMac, i7, M295X)
 
GPU memory usage is also really, really high, even after a reboot (from iStat Menus).
 

Attachment: Screen Shot 2015-07-02 at 3.14.46 PM.png (iStat Menus readout)
OMG this thread has me so confused. :eek:

I've got 2 questions.

1. Do I need the 295 ?

I will never game. I'm a musician and I'll use the iMac as a DAW (digital recording with Logic or Cubase) and for light, rookie-type musician videos. I videotape our gigs on a Zoom Q3 and edit them up.
I plan on getting the i7 with a 500 GB SSD and 8 GB of RAM, and I'll upgrade the RAM on my own.

I can't see needing the 295 for a screen that just sits there while I'm recording. I also don't know whether the 295 is going to be helpful with my type of video editing.


2. If I get the 295, I imagine it will almost never get hot and the fan won't come on much for what I do?


Thanks in advance. This is the biggest computer purchase I'll ever make, OMG :cool:
 
OMG this thread has me so confused. :eek:

I've got 2 questions.

1. Do I need the 295 ?

I will never game. I'm a musician and I'll use the iMac as a DAW (digital recording with Logic or Cubase) and for light, rookie-type musician videos. I videotape our gigs on a Zoom Q3 and edit them up.
I plan on getting the i7 with a 500 GB SSD and 8 GB of RAM, and I'll upgrade the RAM on my own.

I can't see needing the 295 for a screen that just sits there while I'm recording. I also don't know whether the 295 is going to be helpful with my type of video editing.


2. If I get the 295, I imagine it will almost never get hot and the fan won't come on much for what I do?


Thanks in advance. This is the biggest computer purchase I'll ever make, OMG :cool:

In short, no, you probably don't need the upgraded M295X card. In fact, if you get it, the machine will likely run slightly hotter and noisier, because the beefier graphics card generates more heat that has to be cooled.
 
In short, no, you probably don't need the upgraded M295X card. In fact, if you get it, the machine will likely run slightly hotter and noisier, because the beefier graphics card generates more heat that has to be cooled.

So what would be the disadvantage of having the 290 card?


I'm totally not a gamer. I plan on doing some video editing with my iMac, nothing major I guess. I'm a musician and I video record gigs on a Zoom Q3 and edit them up a little.

Will the 290 lock up or freeze during video edits, or do something bad that would make me wish I'd gotten the 295?
 
One question that comes to mind: do these problems continue in the next release of OS X?

Personally, I have an M290X, which I settled on after learning of the heat-related problems on this forum and after figuring out that an i5/M295X configuration was unlikely to be sold at a discount.
 
So what would be the disadvantage of having the 290 card?


I'm totally not a gamer. I plan on doing some video editing with my iMac, nothing major I guess. I'm a musician and I video record gigs on a Zoom Q3 and edit them up a little.

Will the 290 lock up or freeze during video edits, or do something bad that would make me wish I'd gotten the 295?

For your uses, there shouldn't be any problem.

The next release of OS X, El Capitan, will be fine-tuned for even better performance.

These systems (iMacs) are not user-upgradable, so many people max them out at purchase time to keep them as long as possible and to stay smooth with future software releases. It's an investment, after all.
 
One question that comes to mind: do these problems continue in the next release of OS X?

Personally, I have an M290X, which I settled on after learning of the heat-related problems on this forum and after figuring out that an i5/M295X configuration was unlikely to be sold at a discount.

The M295X's high temperatures are related to the design and manufacturing method, so the next OS X may treat the hardware better in general thanks to various optimizations, but when this GPU is really pushed it will certainly go up to ~100C.

High temps are bad in general, but it's a bit early to know the exact effects on this model and its eventual failure rate.
 