...and it has a huge fan on it. So they had to go with the Pro for size, and then overclocked it as far as the case would allow.

Again, out of my butt.

The end of your quote just made me laugh out loud here at work, and now everyone is looking at me!! That was funny.

trout
 
I'm sick of reviewers using such old games for testing... Doom 3 and Far Cry? Those were released 3 years ago, almost 4 for Far Cry. If your publication is so poor it can't afford new games, please skip the graphics testing or at least use demos or something. I'll shoot myself the next time I see Far Cry in a GPU review.

We don't buy 2007 GPUs to play 2004 games.
 
Excuse my ignorance about Macs (this is my first one), but in the PC world we have utilities we can install to see the clock speed and memory speed of the video card.

Do we have such utilities on the Mac? Things like CPUID, but for video cards.

I suppose someone could boot into Windows and check with a utility.
 
Excuse my ignorance about Macs (this is my first one), but in the PC world we have utilities we can install to see the clock speed and memory speed of the video card.

Do we have such utilities on the Mac? Things like CPUID, but for video cards.

The reviewer is using Windows to benchmark, so you can still use Windows utilities to get a definitive answer on the GPU.
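For the OS X side, one thing that does ship with the system (assuming Mac OS X; this is a command fragment, not a full answer) is System Profiler's graphics report, also reachable from Terminal. Note it reports the chipset model and VRAM size, but not the core or memory clocks the PC utilities show:

```shell
# Mac OS X: list the graphics card model, VRAM and connected displays
# (shows chipset and memory size, but not core/memory clock speeds)
system_profiler SPDisplaysDataType
```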

I'm sick of reviewers using such old games for testing... Doom 3 and Far Cry? Those were released 3 years ago, almost 4 for Far Cry. If your publication is so poor it can't afford new games, please skip the graphics testing or at least use demos or something. I'll shoot myself the next time I see Far Cry in a GPU review.

Two reasons: first, using different games every year or two makes it harder to benchmark current cards against older ones. And secondly, whilst the games may be older, plenty of games still come out using the Doom engine and the Far Cry engine.
Until new game engines come out, you might as well stick with games that use the current ones.
 
According to Macworld.com they all use GDDR3 memory. The benchmarks for Unreal Tournament show that the Nvidia GeForce 7300 GT in the older iMacs slightly outperforms the 2600 Pro. I'm assuming the result is far different for the 2400 XT.
 
The reviewer is using Windows to benchmark, so you can still use Windows utilities to get a definitive answer on the GPU.



Two reasons: first, using different games every year or two makes it harder to benchmark current cards against older ones. And secondly, whilst the games may be older, plenty of games still come out using the Doom engine and the Far Cry engine.
Until new game engines come out, you might as well stick with games that use the current ones.

No games besides Far Cry are coming out on the Far Cry engine, so that point is moot. And just because you can run Doom 3 doesn't mean you can run every Doom 3 engine game, or tell how well; at the very least they could use a newer Doom 3 engine game like Prey or something.

And how about testing game engines like the F.E.A.R. engine? The Oblivion engine? The STALKER engine? The Unreal 3 engine? The Call of Duty 2 engine? The Battlefield engine? You know, things from this century :rolleyes: All I'm asking is that they use recent games; honestly, what good does it do to know I can play Far Cry at 100 fps when I already played that game 3 years ago?
 
Macworld has a review too...

Results of the UT2004 test are... shocking. The new iMacs were outperformed by the old 24"er. "OK, so what?" you may ask.


Wait for it.....




The old 24" with the Nvidia 7300.


:eek::eek::eek:
 
Wow, that is extremely weak. That means the 24-inch iMac I bought in October of 2006 easily outperforms the newest iMac in terms of video games? It had the 7600 in it. How is this possible?
 
Results of the UT2004 test are... shocking. The new iMacs were outperformed by the old 24"er. "OK, so what?" you may ask.


Wait for it.....




The old 24" with the Nvidia 7300.


:eek::eek::eek:


The test itself was performed at 1024 x 768 with no AA or AF, hence I would hardly call this test conclusive. I have seen numerous benchmarks where older, less powerful cards marginally outperform newer, more powerful cards at low settings, such as those benchmarked, only to be soundly beaten at higher resolutions with AA and AF enabled.

I would wait for the Barefeats benchmarks before anyone passes judgement. They are second to none in the Mac world.
 
I would wait for the Barefeats benchmarks before anyone passes judgement. They are second to none in the Mac world.
Hear, hear. But rob-art said they'd be posted early NEXT week!! Do you know how many hours away that is??!!

the suspense.........:(
 
So why, aside from the obvious making fun of C|Net, did the iMac do so well on PC World's tests and so poorly on C|Net's? Is there that substantial a difference between the 2600XT and what the 20" iMac they tested had? (I'm not a PC gamer, so sorry if I'm asking an obvious question! :( )
 
I got my 2.4 24" two days ago - now with 4 GB of Kingston RAM in it.

Just tried the Call of Duty 2 demo for you guys (I'm not a gamer). I set the video resolution in the game to the same as the screen, 1920 by 1200, and got my son to put the frame rate counter on.

It hovered around 92 fps most of the time, down to 36 fps at the end when I died under a hail of bullets.

Seems alright to me, what else can I try?
 
The test itself was performed at 1024 x 768 with no AA or AF, hence I would hardly call this test conclusive. I have seen numerous benchmarks where older, less powerful cards marginally outperform newer, more powerful cards at low settings, such as those benchmarked, only to be soundly beaten at higher resolutions with AA and AF enabled.

I would wait for the Barefeats benchmarks before anyone passes judgement. They are second to none in the Mac world.

I agree. I want to know how some of the most demanding games perform at the native resolution of the LCD panel. Everybody knows that when you use a lower-than-native resolution it needs to be scaled/blended, and it tends to look blurry. Add to that the fact that we're talking about a widescreen 16:10 ratio, and the resolution they tested at (1024x768) will look even worse.

I wouldn't want to play a game at a blurry, low-res, stretched-out and distorted resolution with graphics options turned down. It would suck.
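The distortion is easy to quantify with a back-of-the-envelope calculation (illustrative only; it assumes the 4:3 image is stretched to fill the 16:10 panel with no letterboxing):

```python
# How distorted does a 4:3 benchmark resolution get on a 16:10 panel?
panel_w, panel_h = 1920, 1200   # 24" iMac native resolution
game_w, game_h = 1024, 768      # resolution used in the benchmark

scale_x = panel_w / game_w      # 1.875  (horizontal stretch factor)
scale_y = panel_h / game_h      # 1.5625 (vertical stretch factor)
stretch = scale_x / scale_y     # 1.2 -> everything ends up 20% too wide

print(f"x-scale {scale_x}, y-scale {scale_y}, distortion {stretch:.2f}")
```

So on top of the scaling blur, the image is stretched 20% wider than intended unless the game or display letterboxes it.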

Bottom line, what games & settings are playable (meaning a consistent 30 fps) at 1680x1050 or 1920x1200 for the 24"? I realize these are relatively high resolutions, and that the 2600 Pro isn't benchmarking well on other sites.
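That "consistent 30 fps" criterion can be made concrete with a small sketch (the function name and the 1st-percentile dip threshold are my own illustrative choices, not anything from the reviews):

```python
# "Playable" here means a consistent 30 fps: judge by the worst sustained
# dip (1st percentile of per-second fps samples), not by the average.
def playable(fps_samples, floor=30.0, percentile=0.01):
    ordered = sorted(fps_samples)
    idx = int(len(ordered) * percentile)
    return ordered[idx] >= floor

# A high average can hide unplayable dips:
samples = [92] * 95 + [18] * 5      # mostly 92 fps, occasional 18 fps stutter
print(sum(samples) / len(samples))  # average ~88 fps looks fine...
print(playable(samples))            # ...but the dips fail the 30 fps floor
```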
 
Results of the UT2004 test are... shocking. The new iMacs were outperformed by the old 24"er. "OK, so what?" you may ask.

Oh, that's not good.

By the way, there was an option to upgrade to the 7600 GT from the stock 7300 on the old 24".

I wonder how the new one would do against the 7600, when it gets trounced by the 7300.
 
I got my 2.4 24" two days ago - now with 4 GB of Kingston RAM in it.

Just tried the Call of Duty 2 demo for you guys (I'm not a gamer). I set the video resolution in the game to the same as the screen, 1920 by 1200, and got my son to put the frame rate counter on.

It hovered around 92 fps most of the time, down to 36 fps at the end when I died under a hail of bullets.

Seems alright to me, what else can I try?

That sounds decent!

Yeah, I'm much more interested in fps at full screen resolution. I can't stand the blurriness at non-native resolutions... and I don't feel the need for any AA when playing at such high resolutions. Something tells me the newer cards will differentiate themselves better at these higher resolutions as well.
 
PC World have reviewed the 24" version of the new iMac and seem pretty impressed with it, especially the design, which they say:

"...dispenses with the previous version's glossy kitsch in favor of glossy elegance..."

Although they do say that the Mighty Mouse now looks out of place with the aluminium keyboard and case.

I've spent 3 years studying communications.

PC Magazine is dishing out the time-honoured, spin-doctored methods of (covert) Mac "analysis".

I.e., they do state the INDISPUTABLE excellent points, but intersperse those points with (very) minor "negative" points.

E.g., "keyboard too low" and "USB 2.0 ports at rear of the keyboard!" FCS! How long does it take to learn how to plug a USB 2.0 device into the rear of a keyboard (rather than the side)?

Eventually, if a sufficient number of (very) minor "negatives" are included in a review, the real qualities of a product are (in the mind of the reader) polluted...

Reminds me of a (recent) PC mag's "100 reasons why the iPhone doesn't make it"!

This is (sadly still) typical PC mag stuff: most computer bigots are on the clunky (still!) PC side...
 
Paranoia!

PC Magazine is dishing out the time-honoured, spin-doctored methods of (covert) Mac "analysis":

I.e., they do state the INDISPUTABLE excellent points, but intersperse those points with (very) minor "negative" points ...

... This is (sadly still) typical PC mag stuff: most computer bigots are on the clunky (still!) PC side...

Some of the comments on this site really crack me up. Even when a reviewer gives an Apple product 8 or 9 out of 10, there'll be the usual anger from some people over why it wasn't given the full 10/10. How dare any negative points be brought up! This is normally followed by an explanation for the "low" score, e.g. the reviewer works for / is sponsored by / is biased towards PCs / Microsoft.

Some of you are so paranoid; just because an Apple product doesn't get full marks doesn't mean there's some global conspiracy with Dell and Gates lining the pockets of every technology reviewer with bribes!

Cheer up mate, I thought it was a positive review and has definitely swayed me more towards the new iMac over a Macbook Pro. :D
 
Brings up the question: will my current iMac Core 2 Duo 2.16 GHz with the X1600 256 MB card be able to play high definition?

I don't see why not, I play HD DVDs in Windows on my MBP (2.33 GHz, 256 MB X1600) with no problems whatsoever.
 