Ok... I was doing some of the oft-requested overclock vs stock clock 24" 7600GT XP Gaming Benchmarks... when things began to be weird...
I'm wondering if Apple optimized the firmware or other pieces/parts of the 7600GT for the super-high 1920x1200 resolution of the 24" monitor.
I had chosen QUAKE4 as the test bed for my overclocking experiments...
I decided to throw a LOT at QUAKE4 and see just how bad the frame rates would get.
I wasn't disappointed.
I had the resolution set to 1920x1200, Wide Screen, Medium Texture Quality, No Vertical Sync, all options on except the top choice (special effects I think), and also 16x FSAA.
That's right, 16x FSAA.
I figured that would give the overclock the best chance to show a difference.
I wasn't disappointed.
I chose the opening scene in a new game where the soldier is floating, half blown to bits, in space.
With all the above... I got 15FPS MAX and 13FPS MIN during that scene.
I overclocked using nVidia's autodetect settings, and got the following change:
18FPS MAX and 17FPS MIN.
That's about a 20% increase in FPS... of course, 3 FPS isn't much, but when you only have 15 to start with, it's REALLY welcome.
(Oh, just FYI... the same settings as above, only with 2x FSAA instead of 16x FSAA, get you a solid 60FPS.)
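For anyone who wants to sanity-check the percentage claim, here's the back-of-the-envelope math (just a quick Python sketch with my measured numbers plugged in; nothing in it comes from the game or the driver):

```python
# Overclock gain, using my own measured FPS numbers (nothing official here).
stock_max, stock_min = 15, 13   # stock clocks, 1920x1200, 16x FSAA
oc_max, oc_min = 18, 17         # after nVidia's autodetect overclock

gain_max = (oc_max - stock_max) / stock_max * 100   # (18 - 15) / 15 = 20%
gain_min = (oc_min - stock_min) / stock_min * 100   # (17 - 13) / 13 ~= 31%

print(f"MAX FPS gain: {gain_max:.0f}%")   # 20%
print(f"MIN FPS gain: {gain_min:.0f}%")   # 31%
```

So the MIN frame rate actually gains even more than the MAX does.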
Now... HERE is where it gets weird, and where I thought some of you might find this interesting, since some of you are specifically choosing the 20" as the better *gaming* system on the theory that the video card has fewer pixels to render at the lower resolution, and so should theoretically be faster... especially if you're not planning to upgrade to the 7600GT.
So... I thought I'd leave the card at stock clocks and compare 1920x1200 against the 1600x900 mode (the next resolution down)... ALL OTHER SETTINGS THE SAME, EVEN THE 16x FSAA...
And... drumroll...
9FPS MAX and 6FPS MIN at 1600x900.
The LARGER resolution was roughly twice as fast as the next-smaller resolution (about 1.7x on the MAX and 2.2x on the MIN)...
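And just to spell out why this looks backwards, here's the pixel math against the FPS I measured (again only a rough Python sketch of my own numbers, nothing pulled from the card or the game):

```python
# Why this result is backwards: the 24"'s native resolution pushes 60% more
# pixels, yet in my runs it delivered roughly twice the frame rate.
native = (1920, 1200)   # 24" iMac native resolution
lower  = (1600, 900)    # next resolution down in Quake 4's list

pixels_native = native[0] * native[1]   # 2,304,000 pixels
pixels_lower  = lower[0]  * lower[1]    # 1,440,000 pixels
print(f"Pixel ratio (native / lower): {pixels_native / pixels_lower:.2f}x")  # 1.60x more work

# Stock-clock FPS I measured with identical settings (16x FSAA, medium textures)
fps_native = {"MAX": 15, "MIN": 13}
fps_lower  = {"MAX": 9,  "MIN": 6}
for k in ("MAX", "MIN"):
    print(f"{k} FPS ratio (native / lower): {fps_native[k] / fps_lower[k]:.2f}x")  # 1.67x and 2.17x
```

By pixel count alone, 1600x900 should have been the faster of the two, not the slower one.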
I'm going to do further testing... but would love to see someone else try this and see if it is a bug, or... do you think Apple REALLY optimized the firmware / silicon for the 24" iMac?
I don't believe it can be the drivers, as I'm using the Apple-supplied graphics drivers (which APPEAR to be the generic GENERAL RELEASE nVidia drivers, and the latest NON-BETA versions as well!)...
What do you guys think?
EDIT: I have rerun the tests twice now, going back and forth between the resolution settings... it IS repeatable on my machine...
hmmmmmmmmmmmmmm