Real world thus far

With the new 3.06 GHz iMac with the GT 130 card that I got yesterday, I ran a full 25-man Naxx raid in WoW last night with everything set to maximum, including full real-time shadows, at 1920 x 1200 with 24-bit color, 24-bit depth, 4x multisampling, and V-Sync on (which I'm told slows things down some on a Mac), and never dropped below 55 fps. On the 3.06 GHz iMac with the 8800 card that I bought a few weeks ago and exchanged for the new one, I was averaging 32-38 fps at the same settings. For fun, this morning I ran a similar instance in WoW on my PC, which has a comparable Core 2 Duo processor, 4 GB of RAM, and a 640 MB NVIDIA 8800 card. With that machine at the same settings on a 24" Samsung HD monitor, I was averaging 18-25 fps.
I don't know exactly how the GT 130 compares on paper to the 8800 that was in the last model, but in my real-world test the latest "stock" 3.06 GHz iMac ran circles around the previous version with the 8800. The only other difference between the two is the extra 2 GB of RAM in the new model, but I doubt RAM alone would account for that big an increase.
I plan on getting a copy of COD4 today and comparing my new machine against a friend's stock previous-generation model.
 

Sounds like the GT 130 is somewhat, or maybe much, better than the 8800 GS?
 
What about non-gamers who are all over EyeTV/HandBrake transcoding?

So how does the difference play out for someone who isn't a gamer but runs a HandBrake DVD transcode while simultaneously converting an EyeTV show to an ATV-compatible format? Any reason to bump the GPU for these apps, and if so, how far (GT 120, 130, 130 512 MB, 4850)? Is it the same playing field under Snow Leopard?

Thanks.
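
For reference, here's roughly the experiment I have in mind, as a rough sketch (the input paths and the "AppleTV" preset name are placeholders, and it assumes HandBrakeCLI, HandBrake's command-line front end, is on the PATH):

```python
# Rough sketch: run a DVD transcode and an EyeTV-to-ATV conversion at the
# same time and watch where the load goes. Paths and preset are placeholders.
import subprocess
import time

jobs = [
    subprocess.Popen(["HandBrakeCLI", "-i", "/path/to/dvd_rip",
                      "-o", "movie.m4v", "--preset", "AppleTV"]),
    subprocess.Popen(["HandBrakeCLI", "-i", "/path/to/eyetv_export.mpg",
                      "-o", "show.m4v", "--preset", "AppleTV"]),
]

# Poll until both jobs finish; watching Activity Monitor alongside this
# should show whether the load lands on the CPU or the GPU.
while any(job.poll() is None for job in jobs):
    time.sleep(5)

print("exit codes:", [job.returncode for job in jobs])
```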
 
@medic349: would you mind posting the new drivers somewhere? Thanks.
 
No way, no how, is the mobile 4850 even remotely close to 50% faster than the mobile GT 130. In day-to-day use, 30% maybe; in gaming and graphics-intensive apps, 20% better at best.

I've seen many reports that on normal graphics tests the ATI cards don't do significantly better than the equivalent nVidia ones; however, when the AA is cranked up, the ATI cards start to scale a lot better. That's why you might see the increased performance on a very new benchmark that includes AA testing.
 
So we're still talking miles from the 4850, if I recall the numbers correctly, don't you think?
To be honest, I don't believe nVidia has anything that can compete with the HD48xx in the mobile space. Their top of the line is just a mobile 9800GT/G92.
 

Appreciate the validation. Looks like we'll be jumping into the 4-6 week iMac queue today.
 

Actually, it's pretty much confirmed to be some form of 9600GT (see my other thread); it registers as a G94 rev. A1, a "pre-release 9600GT". At Newegg the full desktop cards presently run about $85-100, while the HD 4850 cards run $140-180, so the $50 price differential is about right at the low end. Performance-wise, I would say the ATI should be a pretty huge step up over the nVidia.

Still, it will be very interesting to see what the final HD 4850 release turns out to be. My guess is it will be either an underclocked part or a variant of the mobile version, for power/heat savings. The results Apple showed don't seem to indicate it is vastly better than the 130; my guess is something like 20% better benchmark-wise, so maybe a 3DMark06 score in the 11-12,000 range.
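
Back-of-the-envelope, that guess works out like this (the GT 130 score is an assumption on my part; a 9600GT-class card typically lands somewhere around 9,000-10,000 in 3DMark06 depending on the rest of the system):

```python
# Back-of-the-envelope check of the "20% better" guess.
# gt130_score is assumed, not measured: a 9600GT-class card is
# typically somewhere around 9,000-10,000 in 3DMark06.
for gt130_score in (9000, 9500, 10000):
    hd4850_estimate = gt130_score * 1.20  # the +20% guess
    print(f"GT 130 {gt130_score} -> HD 4850 ~{hd4850_estimate:,.0f}")
# Prints estimates of roughly 10,800-12,000, in line with the
# 11-12,000 range above.
```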
 
Still no GPU-Z. :(
 

Even at modest benchmark estimates it still sounds like it's worth the 4-6 week wait. Would you agree?
 

If you are patient and don't mind the wait, I would say it's definitely worth the $50 for the upgrade. I'm kind of an nVidia fanboy myself, as well as incredibly impatient, so I basically went to the store and decided that if they had the GT 130 in stock I'd bring it home then and there.

If I were going strictly for bang for the buck, I'd actually order the 2.93 GHz, 640 GB model with the 4850 for $1,999. The processor bump is negligible IMHO, and I don't really need the extra hard drive space.

On a side note, it's funny people bitch about Apple's prices, but if you go looking for an all-in-one design comparable to the 24" iMac in the PC world, you'll find nothing close in price/power. Sony has a nice entertainment-oriented unit for $1,899:

http://www.bestbuy.com/site/olspage.jsp?skuId=9097149&type=product&id=1218021922902

but it has 9300M GS graphics, lol. Of course, it does have Blu-ray and other bells and whistles, which is nice, but I wonder how well it will work all in all with the 9300.
 

Well, it looks like your PC has a real problem, though you should know that WoW is a bad example for testing a GPU; it is far more CPU-bound than GPU-bound. Just take a look at the box: the recommended GPU is a Radeon X1600 or an nVidia 7600 with 128 MB of VRAM (those are the WotLK specs). Watch your system stats and you will see that WoW really works your CPU and uses only a small part of your GPU.
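
If you want to see it for yourself, here's a quick sketch that logs per-core CPU load while the game runs (it assumes the third-party psutil package is installed; GPU usage you would still have to read from Activity Monitor or a similar tool):

```python
# Minimal sketch: log per-core CPU load once a second while WoW runs.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(30):  # sample for ~30 seconds
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{load:5.1f}%" for load in per_core))
# On a CPU-bound game like WoW you will typically see one or two cores
# pinned near 100% while the GPU still has headroom to spare.
```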
 
http://www.barefeats.com/imac09c.html

COD4 and X-Plane with the 4850 vs. the GT 130. Basically, the 4850 eats the 130.

I don't understand this discussion. The GT 130 is a rebrand of a low-end midrange card, while the 4850 is a brand-new, modern card from the lower end of the high-end range.

Apple wouldn't offer it as a BTO option unless there was a significant difference. (Yes, the small price premium is confusing, but not everyone got the 3.06!)
 

Chances are that you don't understand this discussion because you necro'd it from 2 months ago when nobody knew what anything really was quite yet. ;)
 