Pushed the iMac with 680MX hard under Windows 7 at 2560x1440 to see if it still beat the 680 Classified on the Mac Pro.

Heaven with 8X AA, 16X Aniso
680MX = 33 FPS
680 Classified = 44 FPS

WoW Pandaria at 8X AA, 16X Aniso
680MX = 77 FPS
680 Classified = 96 FPS

Team Fortress 2 at 8X AA, 16X Aniso
680MX = 88 FPS
680 Classified = 77 FPS

Left 4 Dead 2 at 8X AA, 16X Aniso
680MX = 86 FPS
680 Classified = 109 FPS

At least in these examples, the 680 Classified was faster than the 680MX. The exception is TF2, but according to the amps pulled by the CPU and GPU, it appears to be CPU-bound even at the high res.
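For anyone who wants to eyeball the CPU-bound call, here's the rule of thumb as a quick Python sketch (the FPS numbers are the ones above; the "faster GPU barely helps or even loses means CPU-bound" heuristic is my own shorthand, not a rigorous test):

# If a much faster desktop GPU barely moves (or lowers) the frame rate,
# the bottleneck is probably the CPU, not the GPU.
results = {
    "Heaven":          (33, 44),
    "WoW Pandaria":    (77, 96),
    "Team Fortress 2": (88, 77),
    "Left 4 Dead 2":   (86, 109),
}  # game: (680MX FPS, 680 Classified FPS) at 2560x1440, 8X AA, 16X Aniso

for game, (mx, classified) in results.items():
    gain = (classified - mx) / mx * 100  # % change from the desktop card
    verdict = "likely CPU-bound" if gain <= 0 else "GPU-bound"
    print(f"{game}: {gain:+.0f}% -> {verdict}")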
 

To be expected, as the 680MX is a heavily underclocked GTX 680. I still think the 680MX does extremely well for a mobile GPU; it runs Far Cry 3 at 1440p with all settings on high at 45 FPS, and at 1080p it can run on ultra settings at over 55 FPS! :)
 
Some games

Hi, just here to give some details about the new iMac in my sig, running under Windows 8:

Playing League of Legends at the highest res, highest shadow quality, etc., always over 60 fps.

Playing Dota 2 at the highest res (don't remember if I changed anything else), well above 60 fps also.

I've upgraded the drivers. The one installed with Boot Camp was fine, but I got about 5-10% higher 3DMark scores by upgrading to this version: http://www.geforce.com/drivers/results/54629

My fans are always 100% silent. I really recommend this machine for gamers who still want to buy a Mac. The only minor complaint is the lack of Fusion Drive support under Windows.
 
Pushed the iMac with 680MX hard under Windows 7 at 2560x1440 to see if it still beat the 680 Classified on the Mac Pro.

Sometimes I get the impression you just do that to show how powerful your 680 is... o_O
 
I would like to say a big thank you to Barefeats and all the other posters for their efforts benchmarking in this thread.
For those who may be a little disappointed that the 680MX is not pulling the same framerates as a desktop GTX 680, please bear in mind the difference in clock speeds between the two.

mobile GTX 680MX = 720 MHz processor, 2500 MHz memory
desktop GTX 680 = 1006 MHz processor, 6008 MHz memory

Of course this info came from my own googling, so if I have misinterpreted the figures, perhaps Barefeats or someone else could supply the correct ones :) .
 
WOW. What were your temps while gaming?

Both stock and OC, when running MSI and the associated benchmark, I can push 89-90C, but then the fans are on full -- there's no difference between stock and +250/+350.

I'm curious about the memory speeds, because I wouldn't think that memory speed would cause heat... if it's GDDR5, maybe the effective memory speed is higher than 2500??
 
I use the EVGA software utilities to overclock the GTX 680MX.

Temperature increases from 86°C to 89°C. Fan on full.

My framerate goes from 45/50 to 50/60 in Far Cry 3 (first level, top of the radio tower) at 1440p, Very High, no AA and no Vsync.
 
mobile GTX 680MX = 720 MHz processor, 5000 MHz memory
desktop GTX 680 = 1006 MHz processor, 6008 MHz memory
Fixed.
http://www.3dmark.com/compare/3dm11/5310953/3dm11/3031520
Here's the link to an OC'ed 680MX vs desktop 680 comparison. As you can see, the 680's mem clock is shown as 3004 MHz while its actual effective clock is 6008 MHz. The same goes for the 680MX. If the 680MX really had only a 2500 MHz mem clock, it would score somewhere around P4000, which would be pretty awful for such a GPU =)
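To spell the doubling out, a quick sketch in Python (the reported clocks are from the comparison link above; the x2 is just GDDR5's double data rate):

# The monitoring tools report half the effective GDDR5 rate for both cards.
for card, reported_mhz in [("GTX 680", 3004), ("GTX 680MX", 2500)]:
    print(f"{card}: reported {reported_mhz} MHz -> effective {reported_mhz * 2} MHz")
# GTX 680: reported 3004 MHz -> effective 6008 MHz
# GTX 680MX: reported 2500 MHz -> effective 5000 MHz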
 
Just installed Windows on my 3TB Fusion Drive and ran a 3DMark benchmark.

27" with 680MX (w/ stock Apple Boot Camp drivers)..........6342

I had heard from some that Boot Camp was incompatible with the 3TB Fusion Drive. But it looks like the issue has been resolved. Did you have to do anything special to get it working?
 
Sometimes I get the impression you just do that to show how powerful your 680 is... o_O

Touché. I became a little obsessed when the earlier tests showed the 680MX beating the 680 Classified (as well as the 580 Classified I tested). That flew in the face of logic when you look at the specs.

Turned out that at lesser settings, the games are CPU-bound. Since these games use two cores at most, the iMac has the advantage over the Mac Pro we were using -- especially if Turbo Boost kicks in.

I didn't want hard-core gamers rushing to sell their "hopped up" Mac Pro and replacing it with an iMac until they had a full picture of the performance potential.

Overall, the top-end iMac is impressive and definitely worth serious consideration by anyone looking to replace an older Mac Pro or iMac.
 
I had heard from some that Boot Camp was incompatible with the 3TB Fusion Drive. But it looks like the issue has been resolved. Did you have to do anything special to get it working?

Yeah, it requires some command-line work in Terminal, but nothing too hard. I'm hopeless at using Terminal and I was able to do it. A few other people have tried, but they're having issues....
 
Does anyone know if GPU-Z identifies the 680MX correctly? (I'm away on holiday.) I don't understand the 680MX having a memory speed of 2500 MHz GDDR5... seems odd. Why would this be a heat issue on a mobile processor? Wouldn't the GPU speed be more of an issue?

You need only look at NVIDIA's numbers on the wiki and on their website, and compare the desktop GTX memory bandwidth against the 680MX memory bandwidth, to see something doesn't add up:

http://en.wikipedia.org/wiki/GeForce_600_Series#Chipset_table

128.8 texture fill rate
6000 MHz memory
GDDR5
256-bit memory interface width
192.3 memory bandwidth (GB/sec)

http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx/specifications

92.2 texture fill rate
2500 MHz memory clock
GDDR5 memory interface
256-bit memory interface width
160 memory bandwidth (GB/sec)


And so yes, an overclock of +250 MHz on the GPU and +350 MHz on the memory is pretty significant for the 680MX!
 
Barefeats, is there a possibility to include the 675MX in your comparison tests as well? There are very few tests/benchmarks to be found on it. As I'm a very casual gamer, I'm not sure about the 150 euro/dollar upgrade.
 
Found this interesting article on how memory speeds are calculated.

http://www.geeks3d.com/20100613/tut...-clock-real-and-effective-speeds-demystified/

It would appear NVIDIA is showing the effective memory speed for the GTX 680, but the real memory speed for the 680MX??

Probably more to do with marketing than speed if that's the case.
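And the bandwidth figures on those two spec pages back that up. A quick back-of-the-envelope check, assuming the standard formula (effective clock x bus width / 8):

# Memory bandwidth (GB/s) = effective clock (MHz) * bus width (bits) / 8 / 1000
def bandwidth_gbs(effective_mhz, bus_bits=256):
    return effective_mhz * bus_bits / 8 / 1000

print(bandwidth_gbs(6008))  # ~192.3 -> matches NVIDIA's GTX 680 figure
print(bandwidth_gbs(5000))  # 160.0  -> matches NVIDIA's 680MX figure
print(bandwidth_gbs(2500))  # 80.0   -> matches nothing on either page

# NVIDIA's 160 GB/sec spec only works if 2500 MHz is the real clock and
# the effective GDDR5 rate is 5000 MHz.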
 
My apologies to Kaellar; he was saying the exact same thing in his post. It just didn't click with me till I read the article I linked.
 
Barefeats, is there a possibility to include the 675MX in your comparison tests as well? There are very few tests/benchmarks to be found on it. As I'm a very casual gamer, I'm not sure about the 150 euro/dollar upgrade.

Love to, but I don't have access to an iMac with the 675MX unless I buy one. Maybe someone on this forum will have an iMac with the 675MX and post some results from the free GPU benchmarks I listed, like Heaven (DirectX/OpenGL) and LuxMark (OpenCL). If they have some games or pro apps that stress the GPU, I can give them my procedures for testing the way I did.
 
I have the 27" iMac with GTX 680MX, 1TB Fusion Drive, Intel Core i5-3470.
I use Windows 7 Ultimate 64-bit and the 310.70 NVIDIA drivers.

In 3DMark 11 I get a 6456 graphics score, while the other iMacs I've seen get around 6736.

Anyone have any idea why I get a lower score?
 
I found the score can vary; I got 6400 one time, then 6750 the next. I'm not sure why it does that.
Here is my best score for reference.
 
iMac Score w/OC

I bought the new iMac (i7 3.4GHz, 680MX, 32GB RAM, 768GB SSD) and installed Win8 Pro x64, driver 310.70 WHQL.
Without OC I got P6404.
With OC (EVGA tool, +185MHz/+185MHz) I got P7539 (ID: 5423204). The card ran at 80°C max.
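For what it's worth, the score gain is a bit smaller than the clock bump, which seems about right. A quick calculation (assuming the ~720 MHz stock core quoted earlier in the thread, and that the first +185MHz offset is the core):

# Compare the 3DMark 11 gain to the size of the core-clock offset.
stock_score, oc_score = 6404, 7539
stock_core_mhz, core_offset_mhz = 720, 185  # stock clock per the spec list above

score_gain = (oc_score - stock_score) / stock_score * 100
clock_gain = core_offset_mhz / stock_core_mhz * 100
print(f"score: +{score_gain:.1f}%  core clock: +{clock_gain:.1f}%")
# score: +17.7%  core clock: +25.7%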
 
Overclocking Questions

Hey guys, I've been keeping an eye on this thread for weeks, still awaiting the delivery of my iMac... But thanks to all your posts I can't wait till it arrives!

Anyway, I had a couple of questions. 1. Will overclocking likely cause damage to an iMac? 2. If an iMac 'died' while being overclocked, would it still be covered by AppleCare? (Or would Apple be able to tell you had overclocked it?)

Finally, if anyone can be bothered, could someone briefly explain the points I should know about overclocking before I do it? Like what temperature is dangerous for a GPU, and what +###/+### I should go for when I get my iMac.
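In the meantime, here's how I'm planning to keep an eye on temps once it arrives -- a minimal sketch, assuming the NVIDIA driver's nvidia-smi tool is on the PATH under Windows and exposes these query flags for a GeForce card (the 85C threshold is my own conservative pick, not an official limit):

# Poll the GPU temperature once a second while a stress test runs.
import subprocess
import time

LIMIT_C = 85  # conservative; posters above hit 89-90C with fans on full

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"]
    )
    temp_c = int(out.decode().strip())
    warning = "  <-- back off the overclock!" if temp_c >= LIMIT_C else ""
    print(f"GPU temp: {temp_c}C{warning}")
    time.sleep(1)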
 