Apple's has GDDR3; doesn't the XT support GDDR4? HP's could have GDDR4, maybe that along with the clock speed?
Ahhh, nice pick-up on that. That could be the difference in the tests, aye. Nice work hahahaha
According to their website, the Mobility XT supports DDR2 and DDR3.
Are the CNET iMac tests in OS X?
Indeed. All I really want to know is: can I play SimBin's GT Legends and MS Flight Simulator 2004 on my iMac at native resolution? I don't really care about the name of the video chip. It is more likely that we are deluding ourselves regarding the graphics card. As others have said, wait for the benchmarks. If the card in the new iMac is consistently giving poor scores, then it really doesn't matter what ID strings you can pull out of the drivers or read from the PCI-E bus.
The chip in the iMacs listed as having the HD 2600 Pro is the exact same chip as the Radeon Mobility 2600 and the Radeon Mobility 2600 XT (if there were ANY differences, the chip ID would have been changed). The clock speeds aren't the only differences between the chips, though. How do the pixel shaders/pipelines etc. compare in the three chips (i.e. desktop Pro, iMac Pro and desktop XT)?
ATI cards are crap since the AMD buyout, go with an Nvidia card
Do you know if the Nvidia GeForce 8800 Ultra would fit in the new iMac?
I think a lot of the reviews of the disappointing Pro desktop card were reviewing the GDDR2 cards. Big difference in memory speed there (almost double). The "Pro" label was a poor choice given the bad press of the desktop card. The HP review above is of a 4GB, 2.6GHz machine, and as others have said, it's Vista vs. Tiger.
In the end the HD 2600XT is the fastest mobile DX10 GPU available today. Apple had no choice.
It was a good choice and it's good news, but why would they say 'Pro'? Were they trying to AVOID confusion or create it?
I think we need to hold judgment on the type of card until someone pulls the core and memory speeds off the card and runs benchmarks, then compares them to a known 2600 Pro and a 2600 XT.
Hey, no worries, glad to help. I've got to get the bloody thing to download again. Hmm, when I double-click the .download file it doesn't do anything. lol, it's only a few megs off the finished size too. Gonna have to google how to resume .download files in Windows, I guess.
EDIT: OK, so I downloaded the official ATI control center; after what happened, I thought I should really take the official route. lol. Here's what I found....
[screenshot: ATI Catalyst Control Center showing the iMac's core and memory clocks]
Hope that's of any help. I'll try to figure out the 3DMark download too now.
Thanks Adom, that solves it. It is a Mobility XT like its ID says; it's just been underclocked. That suits me fine, I will leave it alone in OS X and clock it up to XT speeds when gaming in Windows.
(Core/Mem)
Desktop 2600 Pro: 550/700
iMac: 600/685
Mobility 2600 XT: 700/750
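Just to put those memory clocks in perspective, here's a rough bandwidth comparison. This assumes the 128-bit memory bus the HD 2600 family is generally listed with, and treats the numbers above as real clocks (so the effective DDR rate is double). Back-of-the-envelope only, not measured figures:

```python
# Rough peak-memory-bandwidth comparison for the clock sets listed above.
# Assumptions: 128-bit bus (the commonly quoted spec for the HD 2600 family)
# and the figures above being real clocks, so effective rate = 2x listed MHz.

BUS_WIDTH_BITS = 128
DDR_MULTIPLIER = 2

def bandwidth_gb_s(mem_clock_mhz):
    """Peak theoretical bandwidth in GB/s for a given memory clock."""
    return (BUS_WIDTH_BITS / 8) * mem_clock_mhz * 1e6 * DDR_MULTIPLIER / 1e9

for name, mem_clock in [("Desktop 2600 Pro", 700),
                        ("iMac", 685),
                        ("Mobility 2600 XT", 750)]:
    print(f"{name}: {bandwidth_gb_s(mem_clock):.1f} GB/s")
```

If those assumptions hold, that works out to roughly 22.4 / 21.9 / 24.0 GB/s, so the iMac only gives up about 8-9% of peak memory bandwidth against full XT clocks.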
We've got most of the technical data that you're talking about. Now it just comes down to reliable real world tests, preferably comparing it to other apple offerings, which should be coming shortly from barefeats.com.
I'm in agreement with you, but one question: why would they underclock the memory MORE than the Pro? The core looks good, but why not put the memory at 700? What don't I understand? Regardless, I am happy with this card, and I retract the criticism I posted earlier last week.
Trout
The main reason, I would guess, is that they had to maximize the performance-to-heat ratio. With the new iMacs in a smaller enclosure, certain sacrifices had to be made. Keep in mind it may be safe to clock the card up to its original specs in Boot Camp, emphasis on may be. The photos that have been circulating of a disassembled iMac seem to indicate that there is no fan and apparently little or no heatsink on the graphics card, so overclocking would be a risky endeavor.
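For a rough sense of why even a modest downclock matters in a tight enclosure: dynamic power in a chip scales roughly linearly with clock and with the square of voltage (the usual P ~ f * V^2 rule of thumb). A purely illustrative sketch, with a made-up voltage bump since nobody has measured this card's actual voltages:

```python
# Illustrative only: relative dynamic power via the common P ~ f * V^2 scaling.
# The 600/700 MHz figures are the core clocks discussed in this thread; the
# 5% voltage bump for the higher clock is hypothetical, not a measured value.

def relative_power(freq_mhz, voltage, base_freq_mhz=600.0, base_voltage=1.0):
    """Dynamic power relative to the baseline clock/voltage pair."""
    return (freq_mhz / base_freq_mhz) * (voltage / base_voltage) ** 2

# iMac core clock as the baseline vs. hypothetically running at XT core clock
print(relative_power(700, 1.05))  # ~1.29x the heat of the 600 MHz setting
```

So pushing the core back to XT speeds could plausibly mean a quarter to a third more heat from the GPU, which is exactly the kind of margin a slim all-in-one doesn't have.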
Still waiting optimistically on more benchmarks...
-Shepherd
You do know that the GPU is attached to the largest heat sink and fan in the iMac, right?