
aranhil

macrumors newbie
Original poster
Jul 26, 2007
9
0
Hi all. I'm looking at the new $1499 20" iMac. It features the HD 2600 (256MB) graphics card. For somebody who doesn't play games a lot, is this new HD 2600 significantly better than the X1600 in the old iMacs? For me, the game I was going to play only requires a 64MB Radeon 8500. So does this mean the new iMac hasn't really improved over the old one (apart from the slight increase in CPU and hard drive)? Because I'm really leaning toward buying a discounted old iMac instead.

BTW, I never understood what a better graphics card does other than improve the gaming experience. Do movies look better or something?
 

OTA

macrumors newbie
Aug 7, 2007
12
0
BTW, I never understood what a better graphics card does other than improve the gaming experience. Do movies look better or something?

So, take the 20" entry-level model and upgrade the RAM.
I think that will be better for you.
 

capran

macrumors member
Nov 28, 2003
93
0
BTW, I never understood what a better graphics card does other than improve the gaming experience. Do movies look better or something?

No, graphics cards have advanced to the point where 2D and general video quality is very good even for low-end parts. It's mostly only gamers like me who care about faster and more powerful GPUs. That said, it may be possible for GPUs to integrate hardware decoding for newer video codecs like H.264, so video work would be accelerated.

As for how much better the newer 2x00 ATI cards are than the X1600, I'm not sure, but they're both low-end graphics cards, in the $100 range or so for PCs.
 

BornAgainMac

macrumors 604
Feb 4, 2004
7,337
5,355
Florida Resident
Who knows for sure.

I wish the numbers meant something. Technically the newer card should be 1000 (units) better. If the cards are about the same, then maybe it should be called the X1601. With the current numbering system, I have no way to judge how the card will perform.

In the future, ATI will have new cards and these will be the reactions:

In 2009, ATI HD 3300 XT in the new iMac. - Reaction: Sucks. Only has 512 MB. Yesterday's technology.

In 2010, ATI 4900 Ultra AX in the new iMac - Reaction: Sucks. Only has 1 GB of video ram.

In 2013, ATI 7710 Rage 3D Rated R in the new iMac - Reaction: Sucks. I am buying a Mac Pro for my gaming needs.

At least with the Core 2 Duo, the higher the MHz, the faster it is.
 

ReanimationLP

macrumors 68030
Jan 8, 2005
2,782
33
On the moon.
The 2600 is actually basically the same as the X1600, with OpenGL 2.0 and DirectX 10 bolted on, 65nm instead of 90nm, and slightly faster core and memory clocks.

Performance will probably be about 20-50% faster, depending on what game you're playing.
 

chewietobbacca

macrumors 6502
Jun 18, 2007
428
0
The 2600 is actually basically the same as the X1600, with OpenGL 2.0 and DirectX 10 bolted on, 65nm instead of 90nm, and slightly faster core and memory clocks.

Pretty close. The 2600 also boasts UVD, the Unified Video Decoder, which shifts video decoding to the GPU instead of the CPU. For example, CPU load can be reduced from 50-60% all the way down to 10% while the GPU uses its own dedicated hardware to decode H.264. Pretty nifty for DVD playback and so on.

BornAgainMac said:
I wish the numbers meant something. Technically the newer card should be 1000 (units) better. If the cards are about the same, then maybe it should be called the X1601. With the current numbering system, I have no way to judge how the card will perform.

Actually, AMD/ATI and NVIDIA both have pretty consistent numbering schemes. For the previous two generations ATI put an X in front of their card names to signify that they were in the ten thousands; that X has simply been replaced with HD, for high definition, I believe.

For example, the previous generation of ATI cards was the X1000 series. Before that it was the X800 series, and before that the 9000 series. As you can see, the X simply states that they are in the 10,000s. The newest generation of ATI cards is the HD2000 series.

For example, there is the HD2400, HD2600, HD2900, etc.

So the first number is the generation (in this case 2), and the second number designates whether it is a low-, mid-, or high-end card (the higher the number, the higher-end the card). For example, the HD2900XT is the high-end card, the HD2600 is mid-range, and the HD2400 is low end. Previously, the X1600 was mid-range while the X1950 was the high end.

Also, the third number often designates whether it's a refresh. For example, the X850 and X1950 were refreshes of the X800 and X1900. This isn't always true, though, as the X1900 itself was a refresh of the X1800; it varies generation to generation depending on release date, level of competition, etc. The 2900 series has yet to have a refresh, though one is supposedly pending.

Finally, the letters at the end designate the performance level within that range. For ATI, IIRC it has traditionally been:

XTX > XT > PRO > XL > GTO > GT

So, for example, the X1950XTX was the performance king of the X1K series, better than the X1950XT, which in turn was better than the X1950PRO, and all of the X19xx cards were the high-end performance parts.

NVIDIA follows a similar numbering scheme: in 8800, the first 8 designates the 8 series, and the second 8 means high end (if they had an 8900, the 9 would indicate a refresh of the 8). NVIDIA's suffix lettering runs: Ultra > GTX > GTS > GT > GS.

Hope that helps.
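
To make the ATI scheme above concrete, here's a rough sketch in Python. The function name parse_ati_model, its return format, and the suffix list are illustrative assumptions drawn from the description above, not anything official; real model names have plenty of exceptions this doesn't handle.

# Rough sketch of the ATI naming scheme described above (illustrative only).
SUFFIX_RANK = ["GT", "GTO", "XL", "PRO", "XT", "XTX"]  # low -> high, per the post

def parse_ati_model(name: str) -> dict:
    """Split e.g. 'HD2600XT' or 'X1950PRO' into prefix, digits, and suffix."""
    name = name.replace(" ", "").upper()
    prefix = ""
    for p in ("HD", "X"):  # 'HD' = HD2000 generation, 'X' = X800/X1000 generations
        if name.startswith(p):
            prefix, name = p, name[len(p):]
            break
    digits = "".join(ch for ch in name if ch.isdigit())
    suffix = name[len(digits):]
    return {
        "prefix": prefix,
        "generation": digits[0],      # first digit: generation (the 2 in 2600)
        "tier": digits[1],            # second digit: low/mid/high end (4/6/9)
        "refresh": digits[2] != "0",  # nonzero third digit often means a refresh (X1950)
        "suffix": suffix,
        "suffix_rank": SUFFIX_RANK.index(suffix) if suffix in SUFFIX_RANK else None,
    }

print(parse_ati_model("HD 2600 XT"))  # tier '6' -> mid-range, suffix 'XT' near the top
print(parse_ati_model("X1950 PRO"))   # tier '9' -> high end, third digit 5 -> refresh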
 

Dont Hurt Me

macrumors 603
Dec 21, 2002
6,055
6
Yahooville S.C.
The 2600 is actually basically the same as the X1600, with OpenGL 2.0 and DirectX 10 bolted on, 65nm instead of 90nm, and slightly faster core and memory clocks.

Performance will probably be about 20-50% faster, depending on what game you're playing.
Good post, minus your performance increase. I doubt you would see 20-50% in any modern game made in the past 2 years. I would say 2-5%.
 

chewietobbacca

macrumors 6502
Jun 18, 2007
428
0
Good post, minus your performance increase. I doubt you would see 20-50% in any modern game made in the past 2 years. I would say 2-5%.

2-10% in games is about right, with a significant improvement in video decoding speed and so on.

The HD2600PRO is best used for light gaming and home entertainment.

Personally, in my home theater PC, I'm getting an HD2400 PRO (cheap at $50, very good at video decoding, and quiet too) for that task myself :D
 

revenuee

macrumors 68020
Sep 13, 2003
2,251
3
Does anyone know how the 2600 Pro in the 24-inch will perform running Motion and Aperture?
 

iW00t

macrumors 68040
Nov 7, 2006
3,286
0
Defenders of Apple Guild
Can't be any worse than the Intel integrated graphics though, can it?

Nope. Intel graphics is the worst there is.

If any company tries to sell a GPU that loses out to Intel graphics, they should just close up shop and shoot themselves, because Intel integrated graphics costs the mainboard manufacturer $1 at worst.
 