Update: It seems to have brought the temp down a little; with +250/+375 I now get 85C. Maybe it also has something to do with the settings you choose? I have everything on max apart from Shadows, which are on Low (that seemed to give the biggest FPS increase with the least visual impact), plus Shading and Post-processing, which are both on High.

Use MSI Afterburner's OSD function and keep an eye on the GPU temperature during gameplay. Your general temperature might be slightly lower, and the 85C might just be an occasional spike.
Btw., in Crysis 3 I prefer Very High textures and no AA, with the other graphics options at High, as there is very little visual difference between High and Very High IMO (except shader quality, which kills the framerate).

As for temperature, does your iMac have a hard drive/Fusion Drive? Just guessing here, but hard drives produce quite a bit of heat, and it's likely they could increase the overall temperature in a closed system like the iMac. Another reason to get SSDs... :p
 
For overclocking, install MSI Afterburner. Tolerances will differ between units of the same model, but so far everything up to +250 (GPU)/+375 (RAM) seems safe and causes only a slight increase in temperature. +150/+250 and +200/+300 are also good overclock settings. (MSI Kombustor can be used to check overclock stability.) Don't go for the highest overclock first; try the lowest setting, *then* go higher if it's stable.

What to do:
1: Install a program like HWiNFO64 to monitor temperatures, esp. CPU and GPU temperatures (and HD temperature). I prefer to use MSI Afterburner's OSD (on-screen display) function to show the framerate and GPU temp in a top corner of the screen when playing games. This way I can keep an eye on the GPU temp and see the effect of the overclocking - and it can make a BIG difference :).
2: If you want lower GPU and CPU temperatures (and possibly a higher stable overclock), disable CPU turbo boost in Windows 7. This is done in Control Panel: Advanced Power Options: set the maximum processor state to 99%. (Don't change the minimum CPU speed setting.) I recommend this, since otherwise temperatures will get very high unless you use vsync (double buffered). In my experience +285/+425 works very well here, but I would probably go for +250/+375 in the newest games like Far Cry 3 and Crysis 3. Obviously some games don't need overclocking, as they are not too demanding. Game performance with CPU turbo boost off should in most cases be near identical.
3: Use Lubbo's Fan Control and set a static fan speed. I prefer 2400-2500rpm for demanding games, and 1700-1800rpm for older or less demanding games. Don't go higher than 2600rpm!
4: If there are indications your iMac is straining (weird noises), or if it becomes unstable (random reboots, weird graphical glitches), lower the overclock setting. The first time you open Lubbo's, make sure you lower the max fan speed setting!
5: Download RivaTuner, extract D3DOverrider (inside the Tools folder of the RivaTuner installation file) with a zip program (7-Zip works well), and install D3DOverrider. This way you can have triple-buffered vsync in DirectX games. The great thing about triple-buffered vsync is that you get no tearing, but also good framerates and less lag than "normal" vsync.
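If you want a log of GPU temps alongside (or instead of) the OSD, here's a rough sketch that polls the driver with nvidia-smi, which ships with the Nvidia driver and is assumed to be on the PATH. The CSV filename, interval, and sample count are arbitrary choices, not anything required:

```python
import csv
import subprocess
import time

def parse_temp(raw_bytes):
    """Turn nvidia-smi's output (e.g. b' 85\n') into an integer Celsius value."""
    return int(raw_bytes.decode().strip())

def log_gpu_temp(interval_s=5, samples=12, logfile="gpu_temp.csv"):
    """Append timestamped GPU temperatures to a CSV file.

    Run this in a second window while the game is going, then graph or
    eyeball the log afterwards.
    """
    with open(logfile, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(samples):
            raw = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=temperature.gpu",
                 "--format=csv,noheader"])
            writer.writerow([time.strftime("%H:%M:%S"), parse_temp(raw)])
            f.flush()  # so the file is readable while logging
            time.sleep(interval_s)
```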


What not to do:
1: Don't let the GPU heat up much beyond 82-83C. There are differing opinions in this forum about temperature tolerances, but I prefer to be on the safe side; some will find even higher temps acceptable. (With CPU turbo boost off and Lubbo's running, the CPU temp usually stays in the 70s or lower. I can get 80-82C here if I play Crysis 3 for several hours with +250/+375.)
2: Don't mess with voltage settings.
3: Don't let the HD heat up too much, as HDs are more vulnerable to high temperatures. I've read that 60C is not good for the HD, and the temp should stay well below that. I only use SSDs, so I'm not very knowledgeable about HD temperature tolerances. Maybe others can shed more light on this aspect.
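To make these limits concrete, here's a tiny helper you could drop into a monitoring script. The 83C GPU ceiling is this thread's conservative guideline (not a vendor spec), and the 55C hard drive cutoff is my own reading of the "well below 60C" advice:

```python
# Conservative limits from this thread, not vendor specs.
GPU_LIMIT_C = 83   # don't let the GPU go much past 82-83C
HD_LIMIT_C = 55    # stay well below the often-quoted 60C for hard drives

def temp_warnings(gpu_c, hd_c=None):
    """Return a warning string for each temperature over its limit."""
    warnings = []
    if gpu_c > GPU_LIMIT_C:
        warnings.append(
            f"GPU at {gpu_c}C: lower the overclock or raise the fan speed")
    if hd_c is not None and hd_c > HD_LIMIT_C:
        warnings.append(f"HD at {hd_c}C: hard drives dislike heat")
    return warnings
```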


NB: Overclocking is done at your own risk. In most cases it should work just fine, but you never know... I've never had any issues, but I don't mess with voltage settings and I avoid very high temperatures. Enjoy! ;) If you have any additional questions, ask away! PS: All the programs I mentioned are for Windows 7/Boot Camp. It makes no sense to overclock in OS X anyway, as games perform much better in Windows 7.

I'm confused about a few things. Triple buffering + vsync is available in the Nvidia control panel, so why do we need to use RivaTuner?

Second, when I turned on vsync and triple buffering in RivaTuner, I still had visible screen tearing in Borderlands 2, for example. This was with triple buffering and vsync set to off in the Nvidia control panel.

I've found the best setup for Borderlands 2 performance is just to set vsync to on in the Nvidia control panel and not use RivaTuner.
Then set the game to 1080p and turn almost everything on except FXAA and PhysX. That keeps the framerate at 60fps with no tearing.

Any tips, please? :)
 

Well, to unconfuse you... ;) For some reason triple buffering is poorly supported by Nvidia, and the control panel option only applies to OpenGL games. A huge omission by Nvidia IMO. They did add adaptive vsync, which is basically vsync turning itself on when the game runs at 60fps or higher, and turning itself off automatically when the game runs below 60fps (which gives you tearing again). Since DirectX games are much more common, the triple buffering option in the Nvidia panel is of limited value. D3DOverrider adds triple buffering to Direct3D/DirectX games. A (very) few DirectX games won't work with D3DOverrider enabled; Metro 2033 is the only one I've seen so far (the game won't even start properly). All my other games work: Crysis 1-3, Half-Life 2 (doesn't really need it), Far Cry 3, all the STALKER games, Alan Wake, Aliens: Colonial Marines (yuk), etc. I don't have Borderlands 2, so I can't tell you what's going on there. The only thing I can think of is that you should turn off in-game vsync in all DirectX games with D3DOverrider enabled.
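A toy model of the difference (idealized 60Hz numbers; real frame pacing is messier, but it shows where the "less lagging, better framerates" claim comes from):

```python
import math

REFRESH_HZ = 60  # the panel refreshes at 60Hz

def double_buffered_fps(raw_fps, refresh=REFRESH_HZ):
    """Plain (double-buffered) vsync: a finished frame has to wait for the
    next refresh, so the rate snaps down to refresh/1, refresh/2, refresh/3...
    """
    if raw_fps >= refresh:
        return refresh
    return refresh / math.ceil(refresh / raw_fps)

def triple_buffered_fps(raw_fps, refresh=REFRESH_HZ):
    """Triple buffering: the GPU keeps rendering into a spare buffer, so you
    keep roughly the raw frame rate, just capped at the refresh rate.
    """
    return min(raw_fps, refresh)
```

So a game rendering at a raw 50fps gets hammered down to 30fps under plain vsync, but stays at 50fps with triple buffering, still with no tearing.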

I tried to use the 1080p resolution, but 1440p just looks better. :) Of course in some cases the higher framerate is preferable. I just play single player, no multiplayer online...
 

Thanks for some unconfusion. :)

I still tend to prefer games running at a smooth, consistent frame-rate at a less-than-native resolution, so for me just forcing vsync on, either in game or "mastered" from the Nvidia control panel, works best. I can't figure out why Borderlands 2 doesn't work with D3DOverrider, and it seems odd that the only game I've tested doesn't work properly, but I'll let someone else chime in there...

Cheers for the update on what Nvidia is doing. And yes, it does seem odd that they haven't fully supported this in DirectX.
 
Some SimCity benchmarks please ;)
It would be so badass if that game ran smoothly on Ultra at 2560x1440.

I can run SimCity maxed out (2560x1440) and still use the in-game video capture without any problem.
In terms of smoothness, you get 55fps without overclocking with everything maxed out (2560x1440); with overclocking it's a solid 60 (including v-sync).
 
My 675MX i5 gets up to 89C-95C very quickly during the Heaven and Valley benchmarks... no overclocking...

Can anybody explain why, and whether that temp is safe on this card? I'm seeing much lower temps discussed here as too high on the 680MX...
 

Are your temps safe? For the GPU, yes. For the other internals it sits next to, I'm not sure. The cause is hard to pin down for certain; lower-quality thermal compound or a slightly sloppy assembly could be the reason.
 

I guess that's what 3 years of AppleCare is for, right?... :)
 
Sweet! Thanks for sharing these awesome results :)
 

Just did the Valley benchmark again: the chip temp starts at 45C and I can't even make it halfway through the test without it hitting 89C. I have to quit the benchmark or it will easily get over 90C, and at that temp I hear no fan at all...
 

Erm, I'm not getting anywhere near 55fps with everything maxed, so I'm not sure how you're getting those results?

Anyway, if you max everything, the best way to play this game with the i7+680MX is to cap it at 30fps. This smooths things out greatly, as SimCity has a VERY variable frame-rate depending on the density of areas of the city, and it's a game that seriously doesn't need 60fps at all times. 30fps is perfect, and the game runs at that with EVERYTHING set to Ultra/High, FXAA on, Shadows on max, etc. at 2560x1440.
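The 30fps cap itself is simple to reason about: each frame gets a 1/30 ≈ 33ms budget, and the limiter just sleeps out whatever the frame didn't use. Games do this in-engine or via the driver; this is only a minimal sketch of the idea, with a stand-in for the real frame work:

```python
import time

def run_capped(render_frame, cap_fps=30, frames=30):
    """Call render_frame in a loop, sleeping out the rest of each
    1/cap_fps budget so the pace never exceeds cap_fps.
    Returns the achieved average fps.
    """
    budget = 1.0 / cap_fps
    start = time.monotonic()
    for _ in range(frames):
        frame_start = time.monotonic()
        render_frame()  # stand-in for the game's actual frame work
        spent = time.monotonic() - frame_start
        if spent < budget:
            time.sleep(budget - spent)  # burn the leftover budget
    return frames / (time.monotonic() - start)
```

A fast frame sleeps most of its 33ms; a slow frame just runs, so the cap never makes slow sections slower.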

Very awesome game, especially since it... ahem... works now.
 
There's no specific benchmark here but I'm running Windows 8 on my 27" iMac via Boot Camp and I can play BF3 multiplayer at 2560x1440 with everything on Ultra, 4xAA and 16xAF and I typically get 45-50fps. It sometimes drops down to the mid 30s when things get particularly intense but generally it plays great and looks fantastic at native res. I'm very impressed!
 

It's clear that people have different tolerances when it comes to frame rates. I turned quite a few settings down in BF3 to get a consistent 50-60fps. The idea of 45-50fps dropping to the mid-30s is simply not acceptable to me.
 
So, two things:

1. There is a new driver out - http://www.nvidia.co.uk/object/notebook-win8-win7-64bit-314.22-whql-driver-uk.html

2. Bioshock Infinite plays really nicely and looks fantastic. To be honest it's a breath of fresh air from all of the Crysis (1, 2 & 3) + BF3 that I have been playing recently.

With that I am going to bed... My iMac is ruining my sleep!.. But it's so worth it.

Can't speak for Bioshock Infinite, but I've tried every single one of these BETA drivers (including this new 314.22 today), and Tomb Raider never gets any better with TressFX on. Up to 60% better performance? No chance (with our systems).
 

314.22 isn't a beta driver, and the up to 60% increase is for the desktop 680, measured against the last official driver release, 314.07, not the last beta driver. I'm pretty sure I was getting an additional 15-20FPS or so after the last beta driver when using TressFX, though of course not 60FPS, more like 40-45FPS. I can't really remember the specifics, but the drivers released over the last few weeks have definitely given a performance boost, as they usually do!
 

Sorry, I meant I've tried all the beta drivers and now also the new WHQL one. Maybe I wasn't paying attention, but I was using the earlier 314.xx drivers, and I definitely didn't get 15-20fps more with the newer 314.xx drivers and the Tomb Raider patch. At least not according to the in-game Tomb Raider benchmark.
 
Bioshock looks VERY pretty with the 680MX. I'm a total ambient occlusion junkie, so this game fills that need quite well. :D

I have it running at 1920x1080 with pretty much everything maxed at 60fps. I like vsync on, so 2560x1440 had a few drops to 30fps which bothered me. Very pleased with the performance.
 
Just got mine setup, and I'll be doing some benchmarks with a few games; Diablo III and Starcraft 2 to begin with. These will all be Mac native.
 
Found this on the PassMark software site:
 

Attachments
  • video.PNG (54 KB)
Tomb Raider iMac 27'' GTX 680M

Hi guys, I recorded some gameplay of Tomb Raider on my iMac.
Specs - 27'' GTX680M - 24GB RAM - Intel i7 3.40
http://www.youtube.com/watch?v=WxH3-qasU24

In the video, I played at high settings, 2560x1440, locked at 30FPS for better performance, and it was recorded at 80% quality with Bandicam (also, I set the program to record at 1080p while playing at 1440p).

I can play the game at max settings at 2560x1440 without anti-aliasing and TressFX at 45-50 FPS. TressFX drops the framerate to 25-30FPS, so it's pointless. (I'm using the latest drivers, 314.22.)

Here's a question for you:
Set at high quality, the game plays at 50FPS, but the fans run at around 75% speed. I don't like that, so I locked it at 30FPS (it plays great at that framerate) and the fans stayed off entirely.
Is it bad for the computer if you play for a long time with the fans on?
 
Now with new drivers


Actually, forget what I said. Apparently I was using the 314.07 drivers. Now I've updated to 314.22 and there's been quite a boost.
Playing on Ultra (which means no TressFX) at 1440p, I got 33FPS, max 37, using the old drivers. Now I get around 36FPS, hitting 42 sometimes.
Playing on High, also at 1440p (the way I play it), I got 45-50 before and now I get 50-55 FPS, even 60FPS from time to time, but the fans are always on.
So yeah, the new Nvidia drivers did actually boost the performance.

Still, do you guys think having the fans running is bad for the computer? Is it better if they are off?
 

I don't think there's anything worrisome about the fans being on when you play... that's what they're there for. I've been playing Bioshock Infinite at 2560x1440 at a mix of High/Ultra settings and it looks amazing. My fans are running constantly and my temps are fine.
 

Yeah, I've been playing Bioshock Infinite on the same settings as you mentioned. I'm not sure what good temperatures are but according to MSI Afterburner my GPU sits at about 81/82 degrees under the most stress. It's the same when playing Crysis 3. It doesn't quite reach that with BF3 but it's not far off. I've not really heard any noise difference with fans. I have a couple of questions...

1) Is 82 degrees for the GPU an ok temperature? I presume so, and I imagine that it can safely go quite a bit higher, but I wanted to check what you guys are getting.

2) I heard about disabling Turbo Boost (I have the 3.4 i7) by putting the max CPU to 99% in Windows. Are many of you doing this? If so, do you think it's worth it?
 