Upon installing Parallels, I immediately regretted not going with 256 MB, because the most VRAM I can provide Vista with is 64 MB, and that's just not enough.

Of course, it runs fine natively in Boot Camp, when it can use all the resources itself... But when split, performance ain't so hot.
 
Ok, I have to go right now, but when I'm back I'll find a link to prove my point. Apple has indeed underclocked it a lot. Even if the 7600 is a desktop card, it's an older series, and if you look up the benchmarks you'll see the 8600M GT gets some pretty good scores in 3DMark06; it's been said to be on par with the X1900. Besides, hasn't Apple always underclocked their cards for heat issues and the slimness of the MBPs? I mean, even the X1600 was underclocked. Trust me, I know the card has been underclocked by at least 30% or more. :apple:

Trust you?

People who actually have these machines have ALREADY verified the core and memory clock speeds, and they aren't as underclocked as you say.

I'll trust people who actually own the machines.

The performance is on par with the 256MB 8600M GT in the Asus G1S (which, from reports I've read, is actually OVERCLOCKED a bit).

-Zadillo
 

Ok. Well, I guess I could be wrong. But didn't Apple say the 8600M GT in the MBPs is up to 57% faster than the MBPs with the X1600? If that were the case, wouldn't that mean they have underclocked it quite a bit? The 8600M GT scores over twice as much in 3DMark06: the X1600 gets about 1800 while the 8600M GT gets around 3700. That's more than 57%. And also, in this link http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html the 8600M GT is ahead of the 7600GT, which in barefeats' test is well ahead of the 8600M GT, and from those scores it shouldn't be, eh? Feel free to prove me wrong, as I'm just saying what "I've" seen. You probs know way more than me, so looking 4ward to a response from you man :) I just thought the 8600M GT was a bit better than what I saw on barefeats. (Although that was through OS X. Games run better on XP/Vista than on Mac.) Well anyway, nice day to you ;)
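Just to spell that arithmetic out (a rough Python check, using the approximate scores quoted above; nothing here is a measured result):

```python
# Approximate 3DMark06 scores quoted above
x1600_score = 1800
gt8600m_score = 3700

# Relative speedup of the 8600M GT over the X1600
speedup_pct = (gt8600m_score / x1600_score - 1) * 100
print(f"~{speedup_pct:.0f}% faster")  # ~106%, well beyond "up to 57%"
```

So if those scores were representative of games, the gap would be around double, not 57%.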
 

Apple's scores and barefeats are based on benchmarking in OS X, most likely.

In Windows, it would probably be different, and people have seen the full power of the 8600M GT in Windows in their tests.

Also, you are making a mistake in comparing barefeats' results with the notebook graphics card comparison link you mentioned.

BareFeats compared it with the desktop 7600GT in an iMac; the 7600 in the notebook comparison chart is the mobile version, which is not as powerful.

Anyway, the point is, as others have already seen, the actual 8600M GT is not really that underclocked, and people are getting very good performance out of it in Boot Camp, on par with the 8600M GT in the Asus G1S (which is pretty impressive given that the G1S's 8600M GT is actually slightly overclocked, not to mention that the G1S is 6.8 pounds and considerably thicker).
 

Doesn't the 24" iMac have the mobile version of the 7600GT in it anyway? I remember when it first came out it was using the "upgradable" standard for laptop GPUs.
 

No, this has been a source of confusion though.

The iMac does use MXM, which is a standard for upgradeable mobile graphics. But it has a desktop version of the 7600GT connecting through MXM. It doesn't use the mobile 7600GT.
 

Cool. Thanks for the clarification.

So when can I buy an upgrade card for my 24" iMac?
 
No, this has been a source of confusion though.

The iMac does use MXM, which is a standard for upgradeable mobile graphics. But it has a desktop version of the 7600GT connecting through MXM. It doesn't use the mobile 7600GT.

They're effectively identical parts, with one being run at a lower voltage and the other being run at higher clockspeeds.

Otherwise, same memory bus type and width, same number of TMUs, pixel and vertex shaders, raster operators, etc.

So, it doesn't really matter which chip is on there, just what the timings are -- and with mobile parts that's always what it's about, since they vary wildly based on a variety of factors such as enclosure type, preferred battery life, etc. The specced timings are just "manufacturer's recommendations".
 
Zadillo, can u give me a link to how the 8600M GT performs in Boot Camp or on the Asus G1S? plz. I wanna know. Also, don't 4get that the 8600M GT doesn't have mature drivers out for it yet, so newer drivers could increase performance as well.:D
 
Ok, I have to go right now, but when I'm back I'll find a link to prove my point. Apple has indeed underclocked it a lot. Even if the 7600 is a desktop card, it's an older series, and if you look up the benchmarks you'll see the 8600M GT gets some pretty good scores in 3DMark06; it's been said to be on par with the X1900. Besides, hasn't Apple always underclocked their cards for heat issues and the slimness of the MBPs? I mean, even the X1600 was underclocked. Trust me, I know the card has been underclocked by at least 30% or more. :apple:

We have the actual timings and it's not underclocked by anything like that.

Firstly, if you look up the 7600GT desktop part's specs, you'll find it's capable of pretty similar theoretical speeds to the 8600M GT.

Secondly, you're comparing 3DMark06 scores. 3DMark06 is a synthetic benchmark that attempts to calculate what sort of performance a card might have with future games by running various tests and weighting the scores from each test.

In the case of the '06 version, shader performance is quite heavily weighted because future games are expected to use shaders much more heavily. The 8600M GT's shaders on paper should have MUCH better performance than the 7600GT's, so it's likely that's what accounts for the 3DMark06 score.
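To illustrate the effect (this is NOT 3DMark's actual formula, just a toy weighted composite with made-up subtest numbers and a made-up 70/30 weighting, to show how weighting shader-heavy subtests can inflate a score even when fillrate is similar):

```python
def composite(shader, fillrate, shader_weight=0.7):
    """Toy weighted composite score -- not the real 3DMark06 formula."""
    return shader_weight * shader + (1 - shader_weight) * fillrate

# Hypothetical subtest scores: similar fillrate, much stronger shaders
print(composite(shader=1000, fillrate=2000))  # 1300.0 (older-style card)
print(composite(shader=2500, fillrate=2000))  # 2350.0 (shader-heavy card)
```

A current game that barely touches the shaders would see nothing like that gap.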

Note that 3DMark scores in general have frequently not reflected real world performance accurately. Just recently some reviews showed the new ATI HD 2900XT card as being close to the GeForce 8800 GTS. The same reviews showed the latter would also best the former by 25-30% in some games under the right circumstances.

3DMark doesn't really tell us much beyond how well a certain card runs 3DMark.

Most current games also aren't heavily using shaders. So a large part of the 3DMark score is coming from a part of the GPU that's not really being taxed. And if we ignore the shaders, the 7600GT in desktop form is on a par with the 8600M for raw fillrate. The 7600GT should beat it at texturing by dint of more TMUs, but otherwise they're pretty much on a par.
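For the raw fillrate point, the back-of-the-envelope formula is just core clock times ROPs. A quick sketch (the 560MHz clock for the desktop 7600GT and the 8-ROP counts for both chips are from memory of the published specs, so treat them as approximate; the 475MHz figure is nVidia's spec mentioned elsewhere in this thread):

```python
def pixel_fillrate_mpix(core_mhz, rops):
    """Theoretical pixel fillrate in Mpixels/s = core clock * ROP count."""
    return core_mhz * rops

# Approximate published specs (assumed, not measured):
print(pixel_fillrate_mpix(560, 8))  # desktop 7600GT: ~4480 Mpix/s
print(pixel_fillrate_mpix(475, 8))  # 8600M GT at spec: ~3800 Mpix/s
```

Same ballpark, which is the point: it's the shader side where the 8600M GT pulls ahead on paper.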

However, that's assuming the drivers are up to scratch. The G7x arch of the 7600GT is a mature arch that is heavily derived from the previous generation, NV4x (ie 6x00 series). This is a well known and understood arch from being around so long, and the drivers are mature and tweaked for best performance. At least on Windows, the drivers also contain a variety of optimisations for specific titles in order to ensure that they perform as effectively as possible.

The G8x series is new, and the "cut-down" parts (ie not 8800 series) have literally just been launched. The drivers are, as such, very new and it's unlikely they've been tweaked much for performance -- as often tweaks for the "bigger, better" card need further work for the cut down parts.

Thirdly, you're also comparing D3D performance under Windows to OpenGL performance under Mac OS X. They can differ hugely. Back before the multithreaded OpenGL stack appeared, World of Warcraft on the same machine was often showing close to double the FPS under XP compared to OS X.

Furthermore, there are already some noted discrepancies with the 8600's performance, in that the older X1600 from the 17" beats it in UT2004 on the Mac. Given that the 8600 blows the X1600 away on paper, it shouldn't be doing that. Right there, that suggests there are some driver issues.

Finally, those 3DMark scores are all from different machines, as are the iMac and MBP scores. There's always potential for a bottleneck of some kind coming from elsewhere, be it memory, hard disk, whatever.

I'll also note that it doesn't matter if the X1600 was underclocked, if the enclosure was slim, or if Apple has traditionally underclocked parts, or even if Steve Jobs himself whispers "it's underclocked by 30%!" in your ear as you sleep every night, when we have the actual timings used.

We have no need to "trust you" when we can look at them, look at nVidia's recommendations which are on their website, and SEE if it's underclocked or not. The memory is underclocked by ~10%, the core is underclocked by ~1% (which is more likely just clock error from the crystal used).

You certainly don't "know" it's underclocked by 30%.

Please stop claiming the patently ludicrous.
 

Did I detect a little rudeness in those last sentences or words? :mad: No need to be rude if you were. If you read all my comments you'd see I wrote "I could be wrong". Well, thanks anyway for explaining all of that. It really helped. Although I still don't get how it's not underclocked when Apple says it's up to 57% faster than the old MBPs with the X1600 when the scores are twice the numbers? I guess that would have to do with the drivers?

Have a nice day/night.:D If you were getting rude then that's your prob, but when I write my comments I mean no harm to any1 or any offense. I enjoy learning from others cuz it's nice to know these things. :p and I'm just a kid so........... lol Peace n God Bless:cool:
 

You asked for it. :cool:
 
Ok, see, now I'm confused again lol. Look at this Mac forum: http://forums.macnn.com/77/gaming/338096/8600m-gt-vs-x1600/ In that thread they are all saying the 8600M GT has been underclocked loads from what they've seen on barefeats. Some1 needs to set things straight lol. They're talking about how Apple always underclocks due to heat issues and thinness. That's what I was saying at 1st too, and you guys set me straight that it hasn't been underclocked. Now I'm confused again LOL. Go to the forum on the 1st page. I'm gonna keep reading to see if they changed their mind about it being underclocked a lot. lol :p
 

They are wrong and misinformed. Again, as has been stated before, barefeats is testing in OS X.

Even barefeats showed the clockspeeds in their tests, and it shows they aren't underclocked from stock speeds.

Saying it must be underclocked just because it doesn't perform as well as they think it should in OS X is essentially incorrect, though.

Again, much more useful tests as far as gaming performance go would probably be seen in Windows with NVidia's drivers.
 
Did I detect a little rudeness in those last sentences or words? :mad:

It wasn't intended to be rude, simply strongly worded. My apologies for any offense caused. Text-only mediums can suffer from the lack of voice intonation etc. when conveying meaning.

No need to be rude if you were. If you read all my comments you'd see I wrote "I could be wrong". Well, thanks anyway for explaining all of that. It really helped. Although I still don't get how it's not underclocked when Apple says it's up to 57% faster than the old MBPs with the X1600 when the scores are twice the numbers? I guess that would have to do with the drivers?

They make those claims based on specifically benchmarked examples, and usually cite the examples. I assume it means the best result they found in their tests was 57% faster... :)

As you say, drivers can play a big part...

Have a nice day/night.:D If you were getting rude then that's your prob, but when I write my comments I mean no harm to any1 or any offense. I enjoy learning from others cuz it's nice to know these things. :p and I'm just a kid so........... lol Peace n God Bless:cool:

Likewise :)
 
Ok, see, now I'm confused again lol. Look at this Mac forum: http://forums.macnn.com/77/gaming/338096/8600m-gt-vs-x1600/ In that thread they are all saying the 8600M GT has been underclocked loads from what they've seen on barefeats. Some1 needs to set things straight lol. They're talking about how Apple always underclocks due to heat issues and thinness. That's what I was saying at 1st too, and you guys set me straight that it hasn't been underclocked. Now I'm confused again LOL. Go to the forum on the 1st page. I'm gonna keep reading to see if they changed their mind about it being underclocked a lot. lol :p

Having read the discussion, there are so many misconceptions being bandied about that I'm not putting much stock in what they say.

The most reasonable post by far is by someone named "P".

There are a couple of things I hadn't noticed before, though. One is that the BareFeats figures came from within XP, not OS X.

You're probably already aware of this, but just to get it out of the way, because it's pertinent, modern GPUs don't run at a constant clockrate but dynamically adjust their clocks on demand to reduce power usage/heat production.

This could mean there's some underclocking in the OS X drivers, but I'm not convinced this is the case -- I'll explain why. They also tested the 17" MBP, and found that under XP it too is slightly underclocked (memory), but by less than the 15" model. That suggests the max clock speed is set in the card firmware rather than the drivers, since the same driver will be used in XP.

The 17" will also actually exceed nVidia's spec under load -- spec for the core is 475MHz, the 17" hits 520MHz! Of course, the memory's still ~8% underclocked on the 17" and ~10% on the 15".

Now, while it's possible Apple set a limit in the firmware, then a further lower limit in the drivers, it doesn't really make a lot of sense to do so. Plus, I'd expect Windows to honor those limits, since that's what the firmware's supposed to do.

Personally, I suspect it's more likely to be immature drivers, something that will hopefully be fixed with an update. Here's hoping it's before Leopard.
 

Hrmm, I had assumed barefeats ran those tests with Mac games (since the three games they chose happen to be games with Mac versions available).
 
Surprised

I have to admit I'm very surprised by Barefeats' results. I was about to shell out $425 extra without a second thought before seeing this. I could care less about 3 fps... but will the gap be bigger for games (like Oblivion) with large outdoor areas and more textures? I also plan on running it off my external monitor (1920x1200, I think)... the tests still seem to show negligible differences even at this resolution. Thanks for any tips!
 
There are a couple of things I hadn't noticed before, though. One is that the BareFeats figures came from within XP, not OS X.

Hrmm, I had assumed barefeats ran those tests with Mac games (since the three games they chose happen to be games with Mac versions available).

The tests were done under OS X. BareFeats seems to hang out at insidemacgames.com; here's a post where he says he'd compare the 128MB and 256MB MBPs' game performance under OS X (that was posted before the results were actually public): http://www.insidemacgames.com/forum/index.php?s=&showtopic=30334&view=findpost&p=314489

Pretty interesting read, with further confirmation of the huge differences in 3DMark06 scores between the two MBPs right here: http://www.insidemacgames.com/forum/index.php?s=&showtopic=30334&view=findpost&p=314431

I am really puzzled at such a discrepancy between the two models in 3DMark06; it's literally a factor of 2, and it does not get much better with AA and AF off. :confused: Is the 128MB model clocked lower or something?
 
I have to admit I'm very surprised by Barefeats' results. I was about to shell out $425 extra without a second thought before seeing this. I could care less about 3 fps... but will the gap be bigger for games (like Oblivion) with large outdoor areas and more textures? I also plan on running it off my external monitor (1920x1200, I think)... the tests still seem to show negligible differences even at this resolution. Thanks for any tips!

I'm waiting for some benchmarks of the performance of the two cards under Boot Camp before making a definite decision.
 
Only noobz look at 3DMark scores of any kind. They don't really mean much in the real world. That said, the 8600 cards are way better than the X1600. Do yourself a favor and future-proof your computer... get the 256MB card.
 
I have to admit I'm very surprised by Barefeats' results. I was about to shell out $425 extra without a second thought before seeing this. I could care less about 3 fps... but will the gap be bigger for games (like Oblivion) with large outdoor areas and more textures? I also plan on running it off my external monitor (1920x1200, I think)... the tests still seem to show negligible differences even at this resolution. Thanks for any tips!

I think it would be; a game like Oblivion is going to benefit much more from the extra texture memory, etc.

I'd love to hear more real-world gaming results and fewer benchmarks.
 
Zadillo said:
Hrmm, I had assumed barefeats ran those tests with Mac games (since the three games they chose happen to be games with Mac versions available).
The tests were done under OS X.

The clock speeds weren't, and that's what I was referring to. Sorry for the lack of clarity.

barefeats.com said:
CORE CLOCK and MEMORY CLOCK SPEEDS of 8600M vs X1600
In case you weren't aware, the core and memory clock speeds of the MacBook Pro's GPU are variable, depending on what you are doing. Though there is currently no Mac OS X utility to measure this on the newest laptops, we were able to confirm the frequencies of the newest MacBook Pros using ATITool under Windows XP Pro:
 