Macworld has a new review up on the new iMacs. It has an interesting portion:
The new iMacs use ATI Radeon HD series graphics processors. The graphical performance of both the Radeon HD 2400 XT and the Radeon HD 2600 leaves the new iMac in the mainstream performance category when it comes to games and 3-D applications, but it also paves the way for future capabilities. The Unified Shader Architecture touted by Apple and ATI/AMD will make it easier for game developers and others to show off fancy new special effects in their software. The new chips can also perform 128-bit High Dynamic Range (HDR) rendering, which will give games more intense, realistic lighting and shadows. As with the low-end model, the only place where the faster iMacs really fell behind was in the Unreal Tournament test (albeit only slightly), illustrating that the new graphics have a lot of future potential that’s not showing up in our tests—which will make your iMac last longer before becoming outdated (a very big concern in the tech world).
 
The only thing I really wonder about: this card should support Shader Model 4.0, not 3.0 - why is it reported as 3.0? Is this a restriction of the mobility version, or what is going on here?
Can anyone tell me?

SM4 only exists in Direct3D 10. If you are not running Vista with D3D10, then you only get SM3 support.
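To illustrate that point - the shader model a card reports depends on the Direct3D runtime, not just the silicon. A tiny sketch of the commonly documented version/shader-model pairings (illustrative only, not an exhaustive or iMac-specific table):

```python
# Rough mapping from Direct3D runtime version to the highest shader
# model a DX10-class card (like the HD 2600) will report under it.
# These are the commonly documented pairings, used here for illustration.
D3D_TO_SHADER_MODEL = {
    "9.0":  "2.0",
    "9.0c": "3.0",  # Windows XP tops out here
    "10.0": "4.0",  # requires Windows Vista
}

def reported_shader_model(d3d_version: str) -> str:
    """Return the shader model a DX10-capable card reports under a given runtime."""
    return D3D_TO_SHADER_MODEL[d3d_version]

print(reported_shader_model("9.0c"))  # under XP, the HD 2600 reports SM 3.0
```

So a reading of "SM 3.0" says more about the OS and runtime than about the card itself.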
 
So I searched for ATI Mobility Radeon™ HD 2600 Pro and I don't think it exists. I honestly think it might just be a marketing term. 2600 Pro sounds better than 2600 XT to the consumer.

Yes, 2600 PRO sounds so much better than 2600 XT. To be honest, when I saw 2600 PRO I figured it was the higher-end version of ATi's new mid-level cards. But most people do research before buying a computer, and Apple would be stupid to think that no one will find out how bad the actual 2600 PRO card is. If it was a marketing ploy, not only was it a bad one, but it shows that Apple thinks a large chunk of its customers are dumb and will blindly buy anything they make.
 
Why does ATI clock down the cores? Is it because of heat, or something else? I don't really see it as a disadvantage, just a hassle - there are many tools that allow people to change the clocks.

ATi doesn't know what they are doing. They have dropped the ball many times in the past, and the only reason they have had the upper hand for a little while is that Nvidia got cocky and made really bad graphics cards a couple of generations ago (remember the FX5200 cards, anyone?).

Also, AMD purchased ATi because they are desperate, as Intel is crushing them in the CPU market.

What I am hoping is that the drivers ATi has released for these cards are very immature, and that they will take the time to optimize them so we see a nice performance boost in the next few months.

I really want to pick up an iMac.
 
ATi doesn't know what they are doing. They have dropped the ball many times in the past, and the only reason they have had the upper hand for a little while is that Nvidia got cocky and made really bad graphics cards a couple of generations ago.

I think that the HD2400 & HD2600 series might be remembered as a similar damp squib.
 
3DMark scores?

I'm just curious if anyone has seen 3DMark scores for the new iMacs... I know adom was supposed to post some, but I haven't seen them (or I just missed it).

I'd be curious what the 3DMark scores are in Boot Camp... granted, people should take 3DMark with a huge grain of salt, but it might help in gathering more evidence to support whether it is the XT chip.


I've been trying to search the web to find any posted scores with little success.
 
Consumers

Consumers don't care about the chip that's running their graphics. Go to Best Buy - you'll find entry-level chips from 2-3 generations ago sitting on shelves touting the fact that they have 256 MB of RAM on them - that's all people care about. They don't care that the X1300 Pro they just picked up can't even take full advantage of 128 MB of RAM, let alone 256 MB.

And frankly, I don't blame them for not knowing what GPU is faster. The Radeon 8500 was faster than the Radeon 9000 Pro which was faster than the Radeon 9200 - but from looking at the numbers, can you tell? The Radeon 9500 Pro was faster than the 9600 XT which was faster than the 9650 Pro - go figure that one out.

The average home consumer will see fancy Quartz Extreme swooshes, Dashboard ripple effects, and smooth HD video playback; they will see that the iMac has 256 MB of graphics RAM, and be impressed. Hell, they're impressed when the Intel GMA950 claims 256 MB of shared memory... welcome to retail and catering to the lowest common denominator.

Though it would really have been nice to have the option for something faster. They should have just kept the nVidia MXM module and given us the option of plugging in an 8800 of some sort, with some kind of 8600 as the standard card.
 
Well, more than likely it is a 2600 XT that has had its core and memory speeds reduced.

Yes!... now you're getting it :D

And that answers the question everyone is trying to answer here.

Hardware-wise, we have a Mobility 2600 XT (device ID 9583). There is no "Mobility 2600 Pro" hardware in existence.
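For anyone who wants to verify the ID themselves: on OS X, `system_profiler SPDisplaysDataType` lists a "Device ID" field for the graphics chip. A small sketch that pulls it out of such a report - note the sample text below is an assumed excerpt for illustration, not output captured from an actual iMac:

```python
import re

# Illustrative excerpt of what `system_profiler SPDisplaysDataType`
# might print; this sample is assumed for the example, not real output.
SAMPLE_REPORT = """\
ATI Radeon HD 2600 Pro:
  Chipset Model: ATI,RadeonHD2600
  Type: Display
  VRAM (Total): 256 MB
  Device ID: 0x9583
"""

def device_id(report: str) -> str:
    """Extract the PCI device ID from a system_profiler display report."""
    match = re.search(r"Device ID:\s*(0x[0-9A-Fa-f]+)", report)
    if match is None:
        raise ValueError("no Device ID field found")
    return match.group(1).lower()

# 0x9583 is the Mobility 2600 XT's ID, whatever the marketing name says.
print(device_id(SAMPLE_REPORT))  # -> 0x9583
```

The device ID is burned into the silicon, so it tells you what the chip actually is regardless of what the label claims.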

However - it is possible to take a Mobility 2600 XT (the top-of-the-line mobility card), underclock it a bit so that its speeds roughly match those of a "desktop 2600 Pro" - and thus have it mimic/perform as one.

As discussed already, the main difference between these cards, from a hardware point of view, is that some are capable of handling different clock speeds. This doesn't mean a card cannot - for whatever reason - be clocked at less than its potential, and thus mimic a less powerful card.


Make more sense?


The bad news is that Apple released this video card with clock speeds that roughly match those of a desktop-class 2600 Pro.

The good news is that the hardware inside actually appears to be a Mobility 2600 XT, which in turn means it is inherently capable of running somewhere in the neighborhood of 30% faster than what you get out of the box.
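Where a rough figure like that comes from is simple clock arithmetic - note the MHz numbers here are assumed ballpark values for a 2600 Pro-class shipping clock versus a 2600 XT-class rated clock, not confirmed iMac specs:

```python
def headroom_percent(shipped_mhz: float, rated_mhz: float) -> float:
    """Speedup (in percent) available if a part shipped below its rated clock."""
    return (rated_mhz / shipped_mhz - 1.0) * 100.0

# Illustrative clocks only: ~600 MHz (2600 Pro-like shipping speed)
# vs ~800 MHz (2600 XT-class rated speed). Actual iMac clocks may differ.
print(f"{headroom_percent(600.0, 800.0):.0f}% headroom")
```

Any pair of real clocks in that neighborhood lands in the same ~30% ballpark, which is why the exact shipping speed matters less than the chip's rated ceiling.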


Also, the initial Bare Feats tests are up... though keep in mind that it does not appear he has taken any of this into account, and is more than likely running the tests at Apple's default clock speeds. Kind of odd results, notwithstanding.
 
Good news: the new card will probably run new Mac games very well (EA and other companies will of course optimize as much as possible, as the iMac is the most gaming-capable Mac if you look at the price - and I guess most consumers choose the iMac as well).

"The new chips can also perform 128-bit High Dynamic Range (HDR) rendering, which will give games more intense, realistic lighting and shadows. As with the low-end model, the only place where the faster iMacs really fell behind was in the Unreal Tournament test (albeit only slightly), illustrating that the new graphics have a lot of future potential that’s not showing up in our tests—which will make your iMac last longer before becoming outdated (a very big concern in the tech world)."

Bad news: it may run older games worse than the old iMac's graphics card, even though new games will run better on the new iMac than on the old one.
 
Good news: the new card will probably run new Mac games very well (EA and other companies will of course optimize as much as possible, as the iMac is the most gaming-capable Mac if you look at the price - and I guess most consumers choose the iMac as well).

A very good point. Instead of having to optimize a game for countless video cards, games for the Mac can be heavily optimized to get the best out of the hardware.


....Hopefully.
 
Also, the initial Bare Feats tests are up... though keep in mind that it does not appear he has taken any of this into account, and is more than likely running the tests at Apple's default clock speeds. Kind of odd results, notwithstanding.


I hope you are right, as those graphs don't look very promising...

Perhaps I will wait for the Rev 2 2007 iMac... hopefully they will bump the graphics chip to something a bit better.

I'd still like to see some 3DMark numbers (at both normal and "over"-clocked speeds).
 
Good news: you can return the new iMac to the Apple Store as soon as it arrives!

Bad news: it seems to take a long time for Apple to understand that it needs to put a decent GPU into the machine, too.
 
Also, the initial Bare Feats tests are up... though keep in mind that it does not appear he has taken any of this into account, and is more than likely running the tests at Apple's default clock speeds. Kind of odd results, notwithstanding.
The Bare Feats results are odd. They state that the results are similar at 1920x1200. Hmm, the drivers need serious work - especially if the 8600GT is running at similar speeds to the 2600 XT.
 
Just FYI for everyone: new ATI drivers were released *yesterday* that are supposed to better support these new cards in Windoze.

http://ati.amd.com/support/driver.html

The following performance improvements are noticed with Catalyst™ 7.8:

* Radeon™ HD2600 and Radeon™ HD2400 (single and CrossFire™) show improvements of as much as 6% in 3DMark 2006 and 3DMark 2005
* Serious Sam 2 with HDR enabled improves as much as 14.5% on Radeon™ HD2900, Radeon™ HD2600 and Radeon™ HD2400 products

That's interesting. Can we expect further increases in later games?
 
Why not turn it into a large base unit so they can fit a top of the line desktop graphics card in it to make the graph lovers happy... wait...
 
Well, its ID is that of the Mobility chip, but Apple touts it as the desktop Pro.

It's more than likely just an underclocked Mobility HD 2600 XT.

Hmm, do you think that the 24" iMac case could handle the thermal and electrical requirements of the 8800GTS?

Why not turn it into a large base unit so they can fit a top of the line desktop graphics card in it to make the graph lovers happy... wait...
:D
 
DOOM 3 BENCHMARKS

Under Windows XP Pro SP2 on a 2.4GHz Core 2 Duo 20" iMac. Windows reports the card as a 2600 XT.

All on high quality.

1024 x 768 92.8FPS
1280 x 1024 64.7FPS
1600 x 1280 50.7FPS

All just using Bootcamp 1.4 drivers.

As a comparison, my old Pentium 4 HT 3.0GHz with a 512MB GeForce 6800 card managed 87FPS at 1024 x 768 in Doom 3.
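A quick sanity check on those numbers, for the curious: if the card were purely fill-rate bound, frame rate would fall in proportion to pixel count, so pixels-per-second would stay flat. Computing the throughput from the figures above:

```python
# Doom 3 results quoted above: (width, height, fps)
RESULTS = [
    (1024, 768, 92.8),
    (1280, 1024, 64.7),
    (1600, 1280, 50.7),
]

for width, height, fps in RESULTS:
    # Pixels drawn per second (in millions); if the card were purely
    # fill-rate limited, this value would stay roughly constant.
    mpix_per_s = width * height * fps / 1e6
    print(f"{width}x{height}: {mpix_per_s:.0f} Mpix/s")
```

The throughput actually rises with resolution here, which hints that the lower-resolution runs are CPU-limited rather than GPU-limited - so the low-res FPS gap against other machines may say little about the graphics chip itself.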


Note that the new drivers for the '2600 series' released yesterday do not work in Windows. It says there is no supported hardware on board and will not install the drivers.
 
DOOM 3 BENCHMARKS

Interesting..

I don't suppose you could download the free 3DMark06 benchmark tool and run it to get some numbers? I am really curious whether the new iMac is comparable to my 3+ year old 2.5GHz P4 / Nvidia 7800 GS 256MB (AGP) gaming PC... I'd really like to dump that old machine. I know what score that system gets (2779 in 3DMark06), so I'd be curious to see what the new iMacs benchmark at. If the new iMacs are close, I would get the new iMac; otherwise I am just going to wait and deal with using the mini and the PC.

the link if you need it..
http://www.futuremark.com/download/
 
So basically... Apple supplies us with older chips like this... so Apple fans are the lowest common denominator?

Not saying that Apple fans are the lowest common denominator - average consumers are. The people who post on a computer-oriented forum are only a very small minority. Consumers care more about extra graphics RAM than a faster GPU - if you look at most Best Buy, CompUSA, or Circuit City advertisements, they don't list the GPU at all (the vast majority have Intel graphics) - consumers only care about the CPU.

This iMac looks like it was designed by marketing - high-speed processors, large amounts of graphics RAM, and big screens instead of quality screens (like the 20" iMac used to have).
 
DOOM 3 BENCHMARKS


Note that the new drivers for the '2600 series' released yesterday do not work in Windows. It says there is no supported hardware on board and will not install the drivers.

Weird - I thought so too, until I saw "The following performance improvements are noticed with Catalyst™ 7.8:

* Radeon™ HD2600 and Radeon™ HD2400 (single and CrossFire™) show improvements of as much as 6% in 3DMark 2006 and 3DMark 2005
* Serious Sam 2 with HDR enabled improves as much as 14.5% on Radeon™ HD2900, Radeon™ HD2600 and Radeon™ HD2400 products" - the Pro is in there, but it can't detect the iMac's PRO card - because it's an XT :p

:p The iMac's graphics is an XT for sure! Let's just hope for a driver update in the near future! :)
 