I personally prefer to use OCCT for stress testing / error checking. I'd guess it can push your card to 105C in a few seconds.


You're obviously using it under Boot Camp? I see it only has a Windows version, or am I looking in the wrong place? I haven't got a Boot Camp partition, which is why I used the Valley Benchmark. I really just wanted to hammer the GPU for a day.
 
I have the 4.0GHz riMac with 32GB RAM and the upgraded GPU.

The fans have basically not stopped since I started the computer. But the culprit seems to be Dropbox, of all things ...

That said, I still have my 2012 iMac (selling on Gazelle) and it is quiet quiet quiet ...

Can someone explain what the issue might be? I would hope that if anything, the newer iMac would be quieter!
 
You're obviously using it under Boot Camp? I see it only has a Windows version, or am I looking in the wrong place? I haven't got a Boot Camp partition, which is why I used the Valley Benchmark. I really just wanted to hammer the GPU for a day.

Yes, I use it under Boot Camp. It's the best, but there's no OS X version yet.

To stress my GPU in OS X, I run FurMark + Valley + Heaven at the same time.
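For anyone who wants to script that, here's a minimal sketch in Python (the app names are assumptions, not from the post; match them to whatever the bundles are actually called in your /Applications folder):

```python
# Launch several GPU stress apps at once on OS X via the standard `open` command.
# App names below are assumptions -- adjust to match your installed bundles.
import subprocess

for app in ["FurMark", "Unigine Valley", "Unigine Heaven"]:
    # `open -a` returns immediately, so all three end up running in parallel.
    subprocess.run(["open", "-a", app], check=True)
```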
 
I accomplished this with much difficulty. The key I found was to get a fully functional Windows 8.1 installation on the internal drive of the iMac, then use Winclone to make a copy of this working Windows partition. Follow Winclone's instructions to migrate the installation to your external USB 3.0 drive (the key to which is to make sure you have it set up as a GUID partition, but format it as FAT32).
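If it helps, here's a rough Python sketch of that partitioning step (the disk identifier and volume label are placeholders, not from the original post; verify against `diskutil list` first, because partitionDisk erases the whole disk):

```python
# Set up the external USB drive as GUID (GPT) with a single FAT32 volume,
# per Winclone's migration instructions. DISK is a placeholder -- check
# `diskutil list` before running, since this wipes the entire disk.
import subprocess

DISK = "disk2"  # assumption: the external USB 3.0 drive's identifier
subprocess.run(
    ["diskutil", "partitionDisk", DISK,
     "GPT",       # GUID partition scheme
     "MS-DOS",    # FAT32 format
     "WINCLONE",  # FAT labels must be uppercase, 11 characters max
     "0b"],       # use all remaining space
    check=True,
)
```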

Now here's the kicker: I could not get Windows to even install on my internal disk (nor boot from my external USB drive) if anything was plugged into any of the USB or Thunderbolt ports on my computer. Once I unplugged all my hubs and Thunderbolt docks, I could set up Windows on my internal drive and successfully boot to my external Winclone-made USB 3.0 drive, but if anything else was plugged in (other than the single USB 3.0 drive on which Windows was installed, obviously), the computer would fail to boot from the USB 3.0 drive. Try this and see if it works for you.

Thanks, but I decided I'll leave Boot Camp on the internal SSD for now. I only have 256GB, but it's enough for both.
 
Oh man...

I just purchased an M295X version.
So this thread was originally about throttling in Windows, right? Sorry, but it just takes too much time to read all the replies; my English is not so good.
Can somebody summarize in what kind of situation (OS X or Windows, which application, ...) this throttling happens? High temps and fan noise don't bother me. - Thanks to Fenn for this thread!


Edit: I simply love my new iMac and I don't hear any fan noise (standard resolution). The screen is awesome!
 
OK, well, I'm not a gamer, but I got the maxed-out model because I'm getting into photography.

Until today I hadn't heard my fans.

Then I started running basic things in full screen, e.g., iPhoto, Safari, and, worst of all, Maps.


The fans went full blast. Even if the window is stretched manually the fans don't trigger, but green-buttoning it into full screen makes them run full.

So unless this is Yosemite, it's because I'm running it at 3200x1800; but as I mentioned, as a window or with the window stretched to the max (not full screen), they don't trigger.

If you want to know what fans running at full blast sound like, switch to that resolution and use the Maps app in full screen.
 
While I'm no expert on this subject, I believe this to be false. I believe that once you change the output resolution, the monitor receives the signal from the video card at that resolution and then does the scaling itself.
So the video card only outputs that given resolution.

Obviously, that doesn't apply to Retina mode, but that's not what's being discussed here.

If you have information to the contrary, please provide your source.

Someone please correct me if I'm wrong, but on my Retina 5K iMac there is no "Retina" mode; every resolution is Retina mode. Even if you hold the Option key while clicking "Scaled" to see the resolution list, there are no Retina/non-Retina modes. Even at the default 2560x1440, the system or the card (I'm not sure which) is pushing the entire 5K of pixels. When on the default 2560x1440 it "seems" the res is just that, yet 5K-compatible apps give you access to the full 5K without switching res modes. I have no idea how this is being done, though...
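My understanding of how this works (a sketch of how HiDPI scaling behaves, not anything from Apple's documentation): every "scaled" mode renders a 2x backing store, which the system then resamples to the panel's native 5120x2880, so the full 5K of pixels is always in play:

```python
# Rough model of Retina "scaled" modes on the 5K panel: each "looks like WxH"
# mode renders a 2x backing store that is then resampled to the native
# 5120x2880 grid. (My reading of HiDPI scaling, not Apple documentation.)
PANEL = (5120, 2880)

for w, h in [(2560, 1440), (2880, 1620), (3200, 1800)]:
    bw, bh = 2 * w, 2 * h  # 2x backing store
    print(f"looks-like {w}x{h}: renders {bw}x{bh} ({bw * bh / 1e6:.1f} Mpx), "
          f"panel always shows {PANEL[0]}x{PANEL[1]}")
```

Note that at the default "looks like 2560x1440" the 2x backing store is exactly 5120x2880, so no resampling is needed, which would also explain why 5K-aware apps get the full panel without a mode switch.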
 
You need a heat gun to melt the adhesive around the edges of the screen glass to remove it, and a replacement adhesive ring to refit it. Then you have to get at the GPU to put a sensor probe next to it, and having had the 2012 chassis apart, I would find that very difficult to do.

I've yet to see a sensor on a GPU or CPU that seems badly wrong. Put the 5K into deep sleep, then wake it up; if the temperatures are close to ambient, you know the calibration is correct.


You don't need the heat gun. The teardown shows them using a very thin blade inserted at the side and then run down the sides to break the glue contact. Not something I would be game to try on such a new machine, but it seems possible. Getting good contact with the GPU without changing the entire cooling system is another matter entirely, and I doubt it can be done.
 
AppleCare will take care of any failures, and if there's any design issue, you know Apple will take care of it, as they have done with previous design flaws. Only time will tell if this is the case. These AMD cards run hot even on the desktop PC side.

You mean like when they took care of my 2011 MBP? You know, the Radeon-gate ones...
 
If you want to know what fans running at full blast sound like, switch to that resolution and use the Maps app in full screen.

I've tried it, and I don't have that problem. I opened several apps in full screen, like Maps, a YouTube video in Safari, Calendar, and iTunes. My Mac is no louder than my external HDD.

(i5, M295X, 1TB Fusion Drive, 16GB RAM, latest updates, max resolution)
 
"BEAST of an overclocker"

*glances at my 7970 running at 1.7X stock clocks*

*glances back at thread*

*glances at open XCode project for a game I'm working on, specifically for retina displays*

*glances back at thread*

Yup I really don't understand how this all works, please do tell me.

EDIT - AMD knows that the Hawaii and later series cards run hot. They're supposed to run hot and not throttle. Apple's drivers/BIOS implementation must be bonkers - their thermals are definitely good enough to keep it cool.

http://www.bit-tech.net/hardware/graphics/2013/10/28/is-the-amd-radeon-r9-290x-too-hot/1

You are trolling really hard, and clearly don't understand how resolution works. 1080p is 1080p, no matter what display it's outputting to. That's why running 480p on a 480p display will result in the same frame rate as running 480p on a 4K display.

And what do you mean someone "claiming" they can overclock an iMac? Have you not been reading the threads of yesteryear, where the 680MX was an absolute BEAST of an overclocker in the 2012 iMacs? And guess what? It didn't throttle while being overclocked, all the while running cooler than an M295X at stock clocks.

Please, if you're not going to be polite, don't respond any further.

Before this thread forgets this particular debate, I'd like to add that, WilliamG, if you were to play Diablo III on a 5K iMac at 2560x1440 resolution (which would be the accurate comparison to be making if you're to truly and fairly compare the R9 M295X in the 5K iMac and the 680MX in the 2012 iMac), it would look way worse. You would not naturally play Diablo III in that mode UNLESS you were benching these two video cards against each other. I'm not saying you didn't do that, WilliamG, but I'd imagine that if you did, you'd have also noted how much worse it looked on the retina display AT THAT RESOLUTION. IF you didn't do that (and not to offend, but I suspect you didn't), then spyguy10709's point would stand. Plus it would make sense as AMD's best MXM chipset (even if there's no ACTUAL MXM slot present in either machine) in 2014 should outperform NVIDIA's best MXM chipset in 2012.

As for the heat issues, let me remind you people that this is AMD we're talking about and their mobile GPUs have been notorious for causing heating issues. Case in point:

MacBook Pro (15-inch, 17-inch - Early 2011, Late 2011) - Class Action Lawsuit Pending

iMac (Mid 2011) - Repair Extension Program Active

It only became an issue in iMacs when the iMacs themselves became so stupidly thin.
 
As far as I'm aware they did take care of these, didn't they?

Nope. The lawsuit's still going on about it. They took care of the iMacs suffering from the same issue, but it'd be way more expensive to take care of the laptops, so instead they just take the lawsuit, which is ironically the cheaper approach. I had to run out and buy a new MacBook due to the GPU solder failing. Even called Apple to complain. Got me nowhere.
 
Before this thread forgets this particular debate, I'd like to add that, WilliamG, if you were to play Diablo III on a 5K iMac at 2560x1440 resolution (which would be the accurate comparison to be making if you're to truly and fairly compare the R9 M295X in the 5K iMac and the 680MX in the 2012 iMac), it would look way worse. You would not naturally play Diablo III in that mode UNLESS you were benching these two video cards against each other. I'm not saying you didn't do that, WilliamG, but I'd imagine that if you did, you'd have also noted how much worse it looked on the retina display AT THAT RESOLUTION. IF you didn't do that (and not to offend, but I suspect you didn't), then spyguy10709's point would stand. Plus it would make sense as AMD's best MXM chipset (even if there's no ACTUAL MXM slot present in either machine) in 2014 should outperform NVIDIA's best MXM chipset in 2012.

As for the heat issues, let me remind you people that this is AMD we're talking about and their mobile GPUs have been notorious for causing heating issues. Case in point:

MacBook Pro (15-inch, 17-inch - Early 2011, Late 2011) - Class Action Lawsuit Pending

iMac (Mid 2011) - Repair Extension Program Active

It only became an issue in iMacs when the iMacs themselves became so stupidly thin.

Yes I benched the same resolution. 2560x1440 on both systems. Picture quality is worse, but not WAY worse. It's certainly acceptable. I don't see a significant gaming improvement over my 2012 iMac, and when you throw in the massively higher temps on the 295, it's clear - to me at least - that the GPU is a step backwards. I look at every other GPU in every other device Apple makes and all those devices have seen significant improvements. Not the iMac. I accept that, but I'm not happy about it.
 
Yes I benched the same resolution. 2560x1440 on both systems. Picture quality is worse, but not WAY worse. It's certainly acceptable. I don't see a significant gaming improvement over my 2012 iMac, and when you throw in the massively higher temps on the 295, it's clear - to me at least - that the GPU is a step backwards. I look at every other GPU in every other device Apple makes and all those devices have seen significant improvements. Not the iMac. I accept that, but I'm not happy about it.

Driving a 5K display is no small feat. I'd imagine that has a hand in performance issues there. As for the heat, there are a few things to consider:

-AMD has historically never been great about thermally efficient laptop/mobile graphics

-This is only starting to affect the iMac as it gets thinner and thinner (it didn't really start being an issue until the 2009-2011 body style and I can't imagine this retina iMac improves things at all)

-The M295X is, if I'm not mistaken, the most powerful laptop/mobile GPU from AMD that can be put into an iMac at this time. (Which is more to say that a weaker GPU like the M290X probably generates slightly less heat than the M295X.)
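To put a rough number on the point about driving 5K being no small feat, here's the plain resolution arithmetic (just pixel counts, no benchmark data):

```python
# Pixels a game must shade per frame at native 5K vs. the 2012 iMac's
# native 2560x1440. Pure arithmetic; no benchmark data involved.
px_5k = 5120 * 2880   # 14,745,600 pixels
px_qhd = 2560 * 1440  #  3,686,400 pixels
print(f"Native 5K is {px_5k / px_qhd:.0f}x the pixels per frame of 2560x1440")
```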
 
-The M295X is, if I'm not mistaken, the most powerful laptop/mobile GPU from AMD that can be put into an iMac at this time. (Which is more to say that a weaker GPU like the M290X probably generates slightly less heat than the M295X.)

Why Apple decided to go with the AMD 29x I have no idea. The Nvidia 980M would have been a much better choice.
 
Why Apple decided to go with the AMD 29x I have no idea. The Nvidia 980M would have been a much better choice.

I would bet that it comes down to one being able to do 5K SST (single-stream transport) at the time and the other not.

I would have to think that some of the issues with the AMD 29x can be fixed with a new set of drivers. I just don't get what is taking Apple so long. I've had a case open since October and was told there should be an update in November to address some of the issues.
 
Why Apple decided to go with the AMD 29x I have no idea. The Nvidia 980M would have been a much better choice.

The 980M was not available. When you design a new computer, all the component parts must be available months in advance -- in production quantities. In sample quantities they must be available even before that. Apple would have had to slip the entire product release.
 
Driving a 5K display is no small feat. I'd imagine that has a hand in performance issues there. As for the heat, there are a few things to consider:

-AMD has historically never been great about thermally efficient laptop/mobile graphics

-This is only starting to affect the iMac as it gets thinner and thinner (it didn't really start being an issue until the 2009-2011 body style and I can't imagine this retina iMac improves things at all)

-The M295X is, if I'm not mistaken, the most powerful laptop/mobile GPU from AMD that can be put into an iMac at this time. (Which is more to say that a weaker GPU like the M290X probably generates slightly less heat than the M295X.)

I bet there are very few mid-range cards out there that CAN'T drive a 5K display. Apple built a dedicated TCON for the iMac; it just happens to be paired with an AMD card - at least that's how I read things. It could easily have been a 970/980 Nvidia card, too. As far as Apple not having access to the 970/980 earlier during the design - let's go back to 2012, when Apple got the absolutely stupendously good GTX 680MX. It was better than anything on the market for laptops by a HUGE margin.
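On the TCON point, a back-of-envelope bandwidth check (my numbers, using the published DisplayPort 1.2 figures) shows why off-the-shelf display hardware couldn't do it in 2014:

```python
# Why 5K@60Hz needed a custom timing controller in 2014: the raw pixel
# stream exceeds what a single DisplayPort 1.2 link can carry.
width, height, fps, bpp = 5120, 2880, 60, 24
raw_gbps = width * height * fps * bpp / 1e9  # ~21.2 Gbit/s, pixel data only
dp12_payload_gbps = 4 * 5.4 * 0.8            # 4 lanes x HBR2, after 8b/10b overhead
print(f"5K@60 needs ~{raw_gbps:.1f} Gbit/s; one DP 1.2 link carries ~{dp12_payload_gbps:.2f} Gbit/s")
```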

Fast forward to 2014 and we have very, very similar performance from the AMD 295. That's NOT progress. We can kvetch all day and night about what could have/should have happened. I just like to see progress, and aside from the screen in the 5K iMacs, the rest is either standing still (the CPU was already available; the RAM I even transplanted from my 2012 iMac! etc.) or a step sideways.

I'm so used to Apple innovating, historically, that it really bothers me that they didn't make any major efforts when it comes to the GPU.

I understand that I'm in the minority (potentially), but after having my 2012 iMac running cool/quiet for 2 years, the 5K iMac is nothing special aside from its screen (admittedly a big deal - which is why I "upgraded").

At this point, we can argue all day about this and that, but the FACTS remain:

1.) The 2014 5K iMac runs quite a bit hotter under load than the previous 2012/2013 iMacs. At idle or during general web browsing, it still runs pretty cool.

2.) The fan ramps up almost instantly when you play any games. Diablo 3, for example, on my i7 2012 iMac with the GTX 680MX would never ramp my fans up above a whisper in OS X, whereas on the 5K iMac it's almost instant fan noise as the GPU (295) heats up to ~105C. I can't speak for longevity, but this really is not ideal, obviously.

I still love my 5K iMac, and aside from some Yosemite bugs (like getting disconnected from WiFi randomly, which my 2012 Mac mini is now also experiencing since upgrading to Yosemite) the hardware of the system has been running flawlessly.

I suspect the 2015/2016 iMac refresh will be a significant GPU upgrade. Quite frankly it NEEDS to be. The 295 is POOR. Yes, POOR. This is NOT an almost-2015 GPU. In 2012, Apple put a 2013 GPU in their iMacs with the GTX 680MX. I still think they used some sort of sorcery to make that happen with such low heat output in a dramatic redesign of the iMac chassis. In 2014, the M295X is just not good enough. Not. Good. Enough.
 
The 295 is POOR. Yes, POOR. This is NOT an almost-2015 GPU. In 2012, Apple put a 2013 GPU in their iMacs with the GTX 680MX. I still think they used some sort of sorcery to make that happen with such low heat output in a dramatic redesign of the iMac chassis. In 2014, the M295X is just not good enough. Not. Good. Enough.

We can only speculate why Apple did not take the 980M (which is undoubtedly a superior card), but I think in the end it boils down to very mundane reasons like yields and economics. If Nvidia is not able to produce the 980M in the volumes required by Apple, then the M295X is good enough. After all, most consumers would rather have a product that they can actually buy. In the end, the GPU performance is not really up to Apple. They need to pick whatever the GPU companies are producing that will satisfy Apple's constraints.
 
We can only speculate why Apple did not take the 980M (which is undoubtedly a superior card), but I think in the end it boils down to very mundane reasons like yields and economics. If Nvidia is not able to produce the 980M in the volumes required by Apple, then the M295X is good enough. After all, most consumers would rather have a product that they can actually buy. In the end, the GPU performance is not really up to Apple. They need to pick whatever the GPU companies are producing that will satisfy Apple's constraints.

Absolutely the GPU performance is up to Apple. They've proven they can ask Nvidia for custom parts as they did in 2012. They asked, received, and the results spoke for themselves.
 
Absolutely the GPU performance is up to Apple. They've proven they can ask Nvidia for custom parts as they did in 2012. They asked, received, and the results spoke for themselves.

But what if they asked and Nvidia could not provide it this time? The shortage of GM204 chips has been widely reported, so Nvidia has to have at least some manufacturing problems with the new Maxwell. I guess that Apple did not want to wait another 6-9 months until these problems were resolved, so they picked the second-best choice. BTW, the M295X is also de facto a custom part, and AFAIK the chip is currently exclusively available to Apple.

Again, a state-of-the-art GPU that is actually available is arguably a better choice than a 25% faster GPU that is only available on paper.
 
But what if they asked and Nvidia could not provide it this time? The shortage of GM204 chips has been widely reported, so Nvidia has to have at least some manufacturing problems with the new Maxwell. I guess that Apple did not want to wait another 6-9 months until these problems were resolved, so they picked the second-best choice. BTW, the M295X is also de facto a custom part, and AFAIK the chip is currently exclusively available to Apple.

Again, a state-of-the-art GPU that is actually available is arguably a better choice than a 25% faster GPU that is only available on paper.

Absolutely, but I'm pretty sure that, like many people, we'd have been happy to wait. The GPU ended up being a disappointment.
 
I guess that Apple did not want to wait another 6-9 months until these problems were resolved

That may be exactly what Apple wants, so that they can make another iMac 6-9 months later with the Nvidia card and sell it to the people who want this "upgrade" again.
 