Apple's has GDDR3; doesn't the XT support GDDR4? HP's could have GDDR4, so maybe that, along with the clock speed, explains the difference?

According to their website, the Mobility XT supports DDR2 and DDR3.

Are the CNET iMac tests run in OS X?
 
It is more likely that we are deluding ourselves regarding the graphics card. As others have said, wait for the benchmarks. If the card in the new iMac is consistently giving poor scores, then it really doesn't matter what ID strings you can pull out of the drivers or read from the PCI-E bus.
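
(If anyone wants to pull those ID strings themselves, here's a minimal sketch that just prints what OS X reports for the graphics chip via system_profiler. It assumes OS X with system_profiler on the PATH; the exact field names can vary between OS releases, so treat it as a starting point, not gospel.)

```python
# Minimal sketch: print the chipset model and PCI vendor/device IDs that
# OS X reports for the graphics card, using system_profiler.
# Assumption: OS X with system_profiler available; field labels may differ
# between OS releases.
import subprocess

def gpu_id_strings():
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    wanted = ("Chipset Model", "Device ID", "Vendor", "VRAM")
    return [line.strip() for line in out.splitlines()
            if line.strip().startswith(wanted)]

if __name__ == "__main__":
    for line in gpu_id_strings():
        print(line)
```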
 
Indeed. All I really want to know is: can I play SimBin's GT Legends and MS Flight Simulator 2004 on my iMac at native resolution? I don't really care about the name of the video chip.
 
The clock speeds aren't the only differences between the chips, though. How do the pixel shaders/pipelines etc. compare in the three chips (i.e. the desktop Pro, the iMac "Pro" and the desktop XT)?
The chip in the iMacs listed as having the HD 2600 Pro is the exact same chip as the Radeon Mobility 2600 and the Radeon Mobility 2600 XT (if there were ANY differences, the chip ID would have been changed).

The difference between the two Mobility cards is only in the clocks of the GPU/shaders and the memory (assuming GDDR3 memory for all); they have the exact same feature set.

The iMac has the same chip but appears to have different clocks: faster than the Mobility 2600 and slower than the Mobility 2600 XT. The frame rate in any given game under OS X will depend on the custom ROM, the Mac OS X drivers, the extent to which the game takes advantage of multi-threaded OpenGL, and the advanced features of the 2600 chip (such as the more flexible/powerful shader support).
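
(As an aside, the multi-threaded OpenGL engine mentioned above is something a game has to opt into, via the CGLEnable(ctx, kCGLCEMPEngine) call described in Apple's Technical Note TN2085. The sketch below just drives that call from Python with ctypes purely for illustration; it assumes the game has already created a GL context and made it current.)

```python
# Rough sketch of how an OS X game opts into the multi-threaded OpenGL
# engine: CGLEnable(ctx, kCGLCEMPEngine), per Apple TN2085. Driven from
# Python via ctypes only for illustration; a real game would make the same
# call from its own rendering code once a GL context exists.
import ctypes

ogl = ctypes.cdll.LoadLibrary(
    "/System/Library/Frameworks/OpenGL.framework/OpenGL")
ogl.CGLGetCurrentContext.restype = ctypes.c_void_p
ogl.CGLEnable.argtypes = [ctypes.c_void_p, ctypes.c_int]

kCGLCEMPEngine = 313  # CGLContextEnable constant from CGLTypes.h

def enable_multithreaded_gl():
    ctx = ogl.CGLGetCurrentContext()
    if not ctx:
        raise RuntimeError("no current OpenGL context")
    return ogl.CGLEnable(ctx, kCGLCEMPEngine) == 0  # 0 == kCGLNoError
```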

On the Windows side the final frame rate will also depend on the custom ROM (probably not much to worry about here), but games will use the standard (or close to standard) ATI drivers and are more likely to be tuned for the advanced features of the chip, resulting in higher frame rates (with the possible exception of OS X games that use multi-threaded OpenGL).

The card in the iMac should be fine for older games; it just won't improve that much on the performance of older (but acceptably performing) 3D cards. On both the OS X and Windows XP/Vista side, newer games that take advantage of the advanced features of the chip will run faster relative to old games running on the new chip, so beware of estimating the performance of modern games by comparing performance with older games/older cards.

The '2600 Pro' in the iMacs will NEVER be in the same class as an aggressively clocked 8800 anything, or even a 2900 desktop or Mobility board. If you feel you NEED that kind of 3D gaming power then don't purchase an iMac.

If you can live with mid-to-upper-end modern mobile performance, backed by a fully modern selection of chip features (with DX10 support, etc.), you will probably be OK with the '2600 Pro'.

The 2600 Pro in the iMac is not a 'power gaming' card, but it should allow you to play any modern game provided you can accept reducing your settings to boost your frame rate on AAA titles.

How is that for a summary of where we are right now?
 
I think a lot of the reviews of the disappointing Pro desktop card were reviewing the GDDR2 cards. Big difference in memory speed there (almost double). The "Pro" label was a poor choice given the bad press of the desktop card. The HP reviewed above is a 4GB, 2.6GHz machine, and, as others have said, it's Vista vs. Tiger.

In the end, the HD 2600 XT is the fastest mobile DX10 GPU available today. Apple had no choice.
 
It was a good choice and it's good news, but why would they say 'Pro'? Were they trying to AVOID confusion or create it?
 
Apple's use of the word "Pro" was likely their lame attempt to capture the low end of the professional or prosumer market. In other words, a photographer might be more attracted to a Mac that has a "Pro" video card rather than a "Gamer" card. :rolleyes:
 
I think we need to hold judgment on the type of card until someone pulls the core and memory speeds off the card, runs benchmarks, and then compares them to a known 2600 Pro and a 2600 XT.
 
Hey, no worries, glad to help. I've got to get the bloody thing to download again. Hmm, when I double-click the .download file it doesn't do anything. lol, it's only a few megs off the finished size too. Gonna have to Google how to resume .download files in Windows, I guess.

EDIT: OK, so I downloaded the official ATI Control Center; after what happened I thought I should really take the official route. lol. Here's what I found...

[attached screenshot: untitled3rg0.gif, showing the clock speeds reported by the ATI Control Center]

Hope that's of any help. I'll try to figure out the 3DMark download too now.

Thanks Adom, that solves it. It is a Mobility XT like its ID says; it's just been underclocked. That suits me fine. I will leave it alone in OS X and clock it up to XT speeds when gaming in Windows.

(Core/Mem, MHz)
Desktop 2600 Pro: 550/700
iMac: 600/685
Mobility 2600 XT: 700/750
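
To put those memory clocks in perspective, here is a quick back-of-the-envelope bandwidth calculation. It assumes GDDR3 and the 128-bit memory bus the HD 2600 family is listed with, so treat the numbers as rough.

```python
# Back-of-the-envelope GDDR3 bandwidth: clock * 2 (DDR) * bus width / 8.
# The 128-bit bus is an assumption based on published HD 2600 specs.
BUS_WIDTH_BITS = 128

def bandwidth_gb_s(mem_clock_mhz, bus_bits=BUS_WIDTH_BITS):
    return mem_clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

for name, mem_mhz in [("Desktop 2600 Pro", 700),
                      ("iMac", 685),
                      ("Mobility 2600 XT", 750)]:
    print(f"{name}: {bandwidth_gb_s(mem_mhz):.1f} GB/s")
```

By that rough math the iMac has about 21.9 GB/s of theoretical memory bandwidth versus 24.0 GB/s for the Mobility XT, so the memory downclock only costs a couple of GB/s.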

We've got most of the technical data that you're talking about. Now it just comes down to reliable real-world tests, preferably comparing it to other Apple offerings, which should be coming shortly from barefeats.com.
 
I'm in agreement with you, but one question: why would they underclock the memory MORE than the Pro? The core looks good, but why not put the memory at 700? What don't I understand? Regardless, I am happy with this card, and I retract the criticism I posted earlier last week.

Trout
 
Well, it's Apple, and they do things for no real reason. It could also be due to the heat of the card. Or maybe they are doing it on purpose, so that in a few months they can come out with a "gaming iMac" and all they would have to do is put the card back to its retail speeds.
 
The main reason, I would guess, is that they had to balance heat against performance. With the new iMacs in a smaller enclosure, certain sacrifices had to be made. Keep in mind it may be safe to clock the card up to its original specs in Boot Camp, emphasis on may be. The photos that have been circulating of a disassembled iMac seem to indicate that there is no fan and apparently little or no heatsink on the graphics card, so overclocking would be a risky endeavor.

Still waiting optimistically on more benchmarks...

-Shepherd
 
Yep, me too! I was originally very upset with the graphics card, but I am over it now. My childish tantrum has subsided, and I am now thinking I'm ready to go and get one of these puppies.

trout
 
You do know that the GPU is attached to the largest heat sink and fan in the iMac, right?
 
I don't know if this has been mentioned, but in the reviews of some of the games (mainly Quake 4, off the top of my head) the FPS was around 40, I think. Now, I don't know if they have updated the game so that it runs optimized for the dual-core processor. Just a thought, but that adds about 20 fps, if not more, on a PC.
 
To me it looks like it is an ATI Radeon Mobility HD 2600 XT.

Why would they under-clock it, you might ask? ATI has been known to under-clock cards for Apple products; the main reason for this is the lower demand for these cards. If you buy an X1900 for your Mac Pro, the stock PC version is clocked higher. The X1600 that was in the older iMac was also under-clocked.

You really can't think heat is an issue; an iMac has much more room for airflow than a laptop. The reason it is under-clocked is just ATI's choice. Now, I have ordered a 24" iMac and I am a little frustrated that ATI is half-assing its video cards for Apple.

Apple lists it as an ATI Radeon HD 2600 Pro because that is what it is closest to. It does seem to perform a fair bit better than a 2600 Pro but not as well as the Mobility 2600 XT, so if Apple listed it as what it really is, an ATI Radeon Mobility HD 2600 XT, people would still be upset because it would not perform like a full Mobility 2600 XT. Apple would probably rather list it as something lower and have people find that it performs better, than list it as something higher and have everyone find it a disappointment.

In Windows you should be able to clock the ATI Radeon Mobility HD 2600 XT up to its standard clock speeds without any problems.

I'm still waiting for my 24" 2.8GHz iMac to ship but when I get it I will do some nice tests on it.
 