
Tesseract

macrumors regular
Original poster
After scouring the internet for a benchmark like this, gauging the iMac's ability to run the most intensive graphics out there, I had little to no success. So I decided to run my own tests. I'll just leave them here in case anybody is interested.

Late '07 model aluminum 20" iMac
ATI Mobility Radeon 2600 XT (or the HD 2600 Pro, as Apple calls it)
ATI Catalyst 8.9 Drivers (September '08)
4 GB RAM
64-bit Vista Home Premium Edition
Crysis 1.21

Now, as you may already know, the ATI Mobility Radeon 2600 XT that Apple put in the iMac is underclocked (meaning Apple tweaked it to be slower than a normal Mobility 2600 XT - presumably for temperature reasons).

Here is a comparison:
iMac Mobility 2600 XT
Memory: roughly 700 MHz
Core: roughly 600 MHz

Stock ATI Mobility 2600 XT
Memory: roughly 750 MHz
Core: roughly 700 MHz

I used "AMDGPUClockTool" to modify these settings and overclock the card to its non-iMac settings. Note: overclock your card at your own risk. Severely overclocking your GPU can pretty much kill your computer. I did it at very safe levels, considering most manufacturers slightly underclock their GPUs anyways, just to be on the extra extra extra safe side. Apple just took this to a whole new level. Anyways, on to the benchmark. I used CrysisBenchmarkTool1.05, running the GPU test map (first level of Crysis) 3 times on each setting. These are the results in DX10 @ 1680x1050 resolution (the max resolution for the 20" iMac) Vsync off:

iMac stock/default GPU clock speed: 600 core / 700 mem
Run #1- DX10 1680x1050 AA=No AA, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 4.055
Run #2- DX10 1680x1050 AA=No AA, 64 bit test, Quality: High ~~ Overall Average FPS: 6.36
Run #3- DX10 1680x1050 AA=No AA, 64 bit test, Quality: Medium ~~ Overall Average FPS: 14.525
Run #4- DX10 1680x1050 AA=No AA, 64 bit test, Quality: Low ~~ Overall Average FPS: 29.575

Overclocked to ATI standards: 700 core / 750 mem
Run #1- DX10 1680x1050 AA=No AA, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 4.705
Run #2- DX10 1680x1050 AA=No AA, 64 bit test, Quality: High ~~ Overall Average FPS: 8.545
Run #3- DX10 1680x1050 AA=No AA, 64 bit test, Quality: Medium ~~ Overall Average FPS: 17.37
Run #4- DX10 1680x1050 AA=No AA, 64 bit test, Quality: Low ~~ Overall Average FPS: 35.39

So, as you can see, there is roughly a 20% performance increase (according to the Crysis benchmarks, anyway) between Apple's underclocked settings and the GPU's actual default settings. This is quite impressive, considering the test was run at a relatively high resolution in DX10 mode. You will get a large performance boost out of Crysis by lowering the resolution to a less intense setting (say, 1280x800). Most performance junkies will also customize and tweak the graphics settings, resulting in a better-looking game without sacrificing too much FPS.

I also ran these tests in DX10 mode, which uses more advanced shaders (i.e. looks a tiny bit nicer) but drops the FPS a bit. If you're running the game in DX9 mode (XP or Vista 32-bit, which most people will have), you will get an extra 2-5 FPS over DX10. So yeah, most people will want to add 5 FPS or so to the above results.
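For anyone who wants to double-check the math, here's a rough Python sketch that just recomputes the per-setting gains from the average FPS figures listed above (purely illustrative; the numbers are copied straight from the runs):

Code:
# Per-setting speedup from the iMac's stock clocks to the ATI default clocks,
# using the overall average FPS figures listed above (illustrative only).
stock       = {"VeryHigh": 4.055, "High": 6.36,  "Medium": 14.525, "Low": 29.575}
overclocked = {"VeryHigh": 4.705, "High": 8.545, "Medium": 17.37,  "Low": 35.39}

for quality in stock:
    gain = (overclocked[quality] / stock[quality] - 1) * 100
    print(f"{quality:8s}: {stock[quality]:6.2f} -> {overclocked[quality]:6.2f} FPS  (+{gain:.1f}%)")

# Prints roughly: VeryHigh +16%, High +34%, Medium +20%, Low +20%

The Medium and Low runs land almost exactly on that 20% figure; the High run gains a bit more.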

Also, it is probably safe (I use the word "safe" very loosely) to overclock the Mobility 2600 XT even further, past the ATI standard speeds - as that would be actual overclocking, instead of just resetting it to ATI factory speeds.

EDIT (10/10/08):

Here are some quick benchmark results of the Mobility 2600 XT slightly overclocked in DX9.

750 MHz core / 875 MHz memory.
Run #1- DX9 1680x1050 AA=No AA, 64 bit test, Quality: High ~~ Overall Average FPS: 9.835
Run #2- DX9 1680x1050 AA=No AA, 64 bit test, Quality: Medium ~~ Overall Average FPS: 20.25
Run #3- DX9 1680x1050 AA=No AA, 64 bit test, Quality: Low ~~ Overall Average FPS: 40.79

Run #4- DX9 1280x960 AA=No AA, 64 bit test, Quality: Custom ~~ Overall Average FPS: 22.58

There is no default Very High setting in DX9 mode. The Custom quality setting is what I usually play the game in, which is everything set to high except shaders and shadows. I run it at a lower resolution (1280x800 - the benchmark tool only has a 1280x960 setting) to get a better boost out of it.
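Just for perspective, here's the same kind of quick arithmetic showing how far these clocks sit above the iMac's shipped speeds (the MHz figures are the ones quoted in this thread; this is only a sketch and doesn't read anything from the card):

Code:
# Relative clock increases over the 20" iMac's shipped speeds, using the
# MHz figures quoted in this thread (rough arithmetic only).
imac_default  = {"core": 600, "memory": 700}  # Apple's underclocked settings
ati_stock     = {"core": 700, "memory": 750}  # normal Mobility 2600 XT speeds
dx9_overclock = {"core": 750, "memory": 875}  # the clocks used for the DX9 runs above

for label, clocks in (("ATI stock", ati_stock), ("DX9 overclock", dx9_overclock)):
    for part in ("core", "memory"):
        pct = (clocks[part] / imac_default[part] - 1) * 100
        print(f"{label:13s} {part:6s}: {clocks[part]} MHz (+{pct:.0f}% over iMac default)")

# Prints roughly: ATI stock +17% core / +7% memory; DX9 overclock +25% core / +25% memory

So the DX9 runs above were done with both clocks about 25% over what Apple ships.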

Screenshots:
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/a-DX9Low.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/b-DX10Low.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/c-DX9Med.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/d-DX10Med.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/e-DX9High.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/f-DX10High.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/g-DX10VeryHigh.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/h-DX9Custom.jpg

The last screenshot shows my personal graphics settings (1280x800, all settings on High except shaders/shadows, plus a few advanced tweaks). It's the best blend I've yet found between FPS and quality. In hindsight, I probably should have taken the screenshots from a better perspective.
As you can see, DX10 adds very little (if any) noticeable quality improvement. It does improve motion blur and depth-of-field effects a bit, as well as water reflections and shadows, but all of these are extremely hard to notice on anything save Very High (which is DX10-exclusive anyway). The 2600 in the iMac seems to handle DX10 mode remarkably well compared to other GPUs, though, with only slight FPS drops.
 
20% sounds good on paper, but looking at the scores it seems rather marginal. It's a fair test, but I've become jaded about using Crysis to judge anything... Crytek just went WAY overboard on that engine.

I bet the numbers would look much more impressive if you tested, say, UT3, or whatever the current Quake incarnation is.
 
On the contrary, I thought 20% was fairly low-key. But that 5 FPS increase can make all the difference in-game. And I wasn't doing the benchmarking to show off the iMac's graphics capability (that was just a side effect), nor to compare Crysis to any other game or graphics-intensive program, but to show the difference between Apple's underclocked settings and the true potential of the Mobility 2600 XT. If you were to further overclock the 2600 within relatively "safe" levels, you'd likely see up to a 40-50% increase.

(WARNING: do not overclock your GPU unless you know what you're doing)

But it was surprising to discover that the most overboard graphics game out there is certainly playable even on a low-range, year-old iMac. The optimal setting I've found for it is 1280x800 with everything set to High. That keeps the FPS at around 30, and when there's a lot of action it maintains a comfortable 20.

Other graphics-intensive programs will likely have varied results, so this isn't really a benchmark of the iMac as a whole. It just is what it is.
 
Great post.

I ran into a problem the other day, though, with the latest September 8.8 drivers (I don't think we're up to 8.9 yet): they locked out any resolutions above 1280 for me (I play at 1480).

Did you run into this problem? I had to roll back to June's 8.7 drivers. I'm on WinXP SP2 and have a 20" 2600.

Regarding the overclocking, have you tried running it on a loop for 12 hours or so to verify it's OK with the extra heat? Are you on the 24" or 20" iMac?
 

The current ATI Catalyst drivers are 8.9 - just checked their website.

That's a strange problem; I haven't personally seen it before. I did have some problems a few months back with updating to new drivers - they just refused to update or install. I have since been using MobilityModder - do you use that? You might want to look into it. It's very useful if you want all your apps to recognize that your GPU is a Mobility 2600 XT, and you'll get a very slight performance boost out of it too. But mainly, it just makes installing new drivers really easy. http://www.driverheaven.net/modtool.php

The instructions might seem a bit daunting, but it's actually quite a simple program. Maybe that will fix your problem?
 
Now we just need to see detailed benchmarks for the newest 3.06 GHz 24" iMac with the nVidia 8800 GS :)


Also, do you think it would run better in an XP environment, as opposed to Vista?
 

I was actually wondering the same things. From everything I've heard, there's not really that big of a performance decrease from XP to Vista SP1 if you have enough RAM. I've even heard some of the newer stuff runs -better- in Vista. But as far as my own testing, I have no clue.

Apparently, Apple pulled the same stunt with the 8800 GS as they did with the 2600 XT. I guess it's a slightly more powerful mobility version of the GPU, which is also underclocked. So it would be interesting to run the same kinds of tests with the iMac default speeds vs. the "stock" nVidia speeds.

And AlexisV - I forgot to reply to the last part of your post about overheating: no, I haven't really tested the long-term temperature effects of overclocking (or de-underclocking, in this case). I did have Crysis running those benchmarks for a solid 3 hours, though, and the iMac seemed fine. It didn't get any hotter than it usually does, and it didn't suddenly do an emergency shutdown like it sometimes does if you're pushing it too hard for too long.

I assume, since mobility GPUs are mostly just underclocked versions of a more powerful desktop card anyway (not to mention Apple's further underclocking), that de-underclocking it will have little ill effect. Maybe Apple had good reason to underclock it, but I doubt it, since their previous history shows them doing this for aesthetics, or just to be overly/needlessly safe. Again, that's just my opinion. There's still an element of risk when you're messing with your GPU's clock speed.
 
Okay, so I have looked around MacRumors and done some searching. I haven't found any benchmarks for the 3.06 per se, but someone mentioned that his 3.06 setup with only 2 GB of RAM ran Crysis at native resolution (1920x1200) with settings on High, AA turned off, and it ran "smooth." He also said that at 1600x1280, it ran smooth with settings at High and 4x AA. It seems that the AA in Crysis really kills the performance, and apparently, from what I have read, it's difficult to notice much difference. At the very least, you could run Crysis at 30 fps, for sure, at High settings and a slightly lower resolution.
 
Can you post some screenshots showing the difference between the quality settings? No benchmark is complete without pics!
 

Thanks for the info - that's better than I would have guessed the 8800 GS to perform. As far as AA goes, it is arguably the single biggest performance hog with regard to settings, with shaders in second place. The quality increase with AA is minimal, as all it does is remove some jagged edges or pixels from objects (in layman's terms). In my own opinion, AA even makes some things look worse. Some people can't live without AA, though. Most people can't see a difference. I always leave AA off.


As for the screenshots - I'll get right on that.

Edit: Screenshots as well as an extra benchmark are up.
 
The funny thing is that I think Half-Life 2 looks better at high settings, and it probably runs better.
 
iMac 24" ATI overclocking

Dear iMac friends,

I have also been experimenting with overclocking my ATI Mobility Radeon HD 2600 XT, and I have been successful with it.

Without changing the fan speeds, I can set the core clock with the ATI Tool to 850 MHz and the memory clock to 900 MHz without any problems.

I also get a 3D performance increase of 20% with these settings, and I think that's really impressive.

Edit: I ran my tests on the iMac 24", 2.8 GHz, 4 GB RAM.

Namnorkimo
 


Yeah, I have the exact same specs too, but when I opened up the AMD tool to overclock, it already said the clock speed was 747.00. Does this mean that some models may come at the standard speeds? Also, I don't understand something: if the iMac has the ATI card, but it is soldered in Apple's specific way, should there still be remains of how the card actually looks, e.g. the casing? I have opened my Mac up many times and only seen a soldered-on GPU. Many thanks in advance!
 
Hello Macnificent,

The ATI Mobility Radeon HD 2600 XT in the iMac uses a core clock of 675 MHz and a memory clock of 747 MHz. This is lower than expected: normally this card uses a 700 MHz core and a 750 MHz memory clock, as we can verify here. Apple has clocked down the ATI card for some reason. The ATI tool is a nice and easy way to get more 3D speed. In "Test Drive Unlimited" the FPS increase is 26%! Use SmcFanControl to increase the fan speeds slightly if needed; the iMac will keep the setting when rebooting into Windows.
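A quick bit of arithmetic on those figures (purely illustrative; the MHz values are just the ones quoted above) shows how small the gap actually is on this model:

Code:
# Gap between the quoted iMac clocks and the normal Mobility 2600 XT speeds
# (rough arithmetic on the MHz figures above; illustrative only).
imac_clocks = {"core": 675, "memory": 747}
standard    = {"core": 700, "memory": 750}

for part in ("core", "memory"):
    deficit = (1 - imac_clocks[part] / standard[part]) * 100
    print(f"{part:6s}: {imac_clocks[part]} vs {standard[part]} MHz  ({deficit:.1f}% below standard)")

# Prints roughly: core 3.6% below, memory 0.4% below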
 
Fan speed and heat generation.

I remember clocking my Mobility X1600 just fine to stock speeds.
 
"Fan speed and heat generation."

You can raise the fan speeds with SmcFanControl in Mac OS before booting Windows to keep the Mac cooler. The CPU and the ATI card seem to use the same heat pipe, so you'll need to adjust the CPU fan only.
 