
OllyW

Moderator
Staff member
Oct 11, 2005
17,196
6,800
The Black Country, England
Yep, Apple has been using the same video card in the MacBook Pros for close to 18 months. I don't care who they are targeted towards, but 18 months in the computer industry is like an era!

The MacBook Pro was announced in January 2006 and shipped in February 2006.

How on earth do you make that "close to 18 months"?

I know you like to criticise Apple, but there's no need to exaggerate :p
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
Ah, thanks for the correction. It's been quite a long while since I've seen them. I'd like to get benchmarks though, as the MBP X1600 is underclocked, so a normally clocked GeForce 7400 may be comparable.

Anyone have an SZ series Vaio? :]

It isn't. The X1600 in the MBP is underclocked, but not that much (stock X1600 clock speeds are 470/470; the current MBP is clocked at 418.50/445.50).

The result, though, is an MBP that still performs quite well. A typical 3DMark05 score is around 3955, and a typical 3DMark06 score is 2126.

By comparison, a typical Vaio SZ scores around 1851 in 3DMark05 and about 794 in 3DMark06. That is when using the GeForce 7400, of course; the scores are lower in stamina mode using the Intel GMA950.

Anyway, just to demonstrate that the X1600 is not so "out of date", note that the Asus G1 gaming laptop, with a GeForce 7700 and 512MB of VRAM, gets 4234 in 3DMark05 and 2389 in 3DMark06 (these figures are from the notebookreview.com review).

So yes, a bit faster than an MBP, but not dramatically so... and the G1, of course, is 1.5" thick and around 7 pounds.
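
If you want to sanity-check those percentages, here's a quick back-of-the-envelope sketch (plain Python, using only the scores quoted above; they're forum-reported figures, not official numbers):

```python
# 3DMark scores quoted above (MBP X1600, Vaio SZ with GeForce 7400,
# Asus G1 with Go 7700), expressed relative to the MacBook Pro.
scores = {
    "3DMark05": {"MBP X1600": 3955, "Vaio SZ 7400": 1851, "Asus G1 7700": 4234},
    "3DMark06": {"MBP X1600": 2126, "Vaio SZ 7400": 794, "Asus G1 7700": 2389},
}

for bench, results in scores.items():
    base = results["MBP X1600"]
    for machine, score in results.items():
        print(f"{bench} {machine}: {score} ({score / base:.0%} of the MBP)")
# The G1 comes out around 107% (3DMark05) and 112% (3DMark06) of the MBP.
```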

-Zadillo

EDIT: Don't get me wrong, the SZ is a nice little machine, and I wouldn't knock it for its GPU performance. Given its size and weight, it's remarkable that they even got the GeForce 7400 into it. And I think the ability to switch between dedicated and integrated graphics to improve battery life is a great idea. I just wouldn't compare the SZ's graphics performance to that of larger machines.
 

commander.data

macrumors 65816
Nov 10, 2006
1,058
187
I'm also of the opinion that the MR X1600 is not out of date. But even if it were, what could Apple do? The limited thermal budget prevents them from using high-end GPUs (at the very least, they won't work in the 15.4" model), so they are really stuck with mid-range parts. The current choices are the MR X1600, the MR X1700, the Go 7600, and the Go 7700. Apple is already using one of those four, and the Go 7600 performs the same as or worse than the MR X1600 (though that becomes something of an ATI vs. nVidia debate).

The MR X1700 is identical to the MR X1600 except that it is produced on a 90nm strained-silicon process, so in theory it uses a little less power and can be clocked a bit higher. In practice, the MR X1600 performs exactly the same as the MR X1700 at the same clock speeds; it may just consume a bit more power. The MR X1700 certainly doesn't offer enough to make the MR X1600 out of date. Seeing that the MR X1700 hasn't really replaced the MR X1600, I'm betting there are production or price concerns with the strained-silicon process that make the MR X1700 prohibitive. ATI's drivers also don't seem to have universal support for the MR X1700 yet.

That really only leaves the Go 7700, and yes, I admit the Go 7700 would be faster in basically all cases, since it's a 12-pipeline design versus the MR X1600's 4-pipeline, 12-pixel-shader layout. Still, as Zadillo mentions, the difference is hardly earth-shattering. The Go 7700 may have a lot of hardware, but those 12 pipelines don't perform at their full potential since they don't have sufficient memory bandwidth feeding them. So the difference between the Go 7700 and the MR X1600 is nowhere near the 7600GT's advantage over the desktop X1600XT. The Go 7700 does use the 80nm process, but that's a cost-oriented half-node, so it wasn't optimized for power savings. In any case, the Go 7700 does seem pretty power efficient considering the hardware involved, but it would probably still use more power than the MR X1600. I don't think the performance difference is enough to condemn Apple for not going with the Go 7700 in the C2D refresh, especially if ATI was offering Apple a good discount to continue using the MR X1600.
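
To put a rough number on the bandwidth point, here's a minimal sketch; the 128-bit bus width and the Go 7700 memory clock are my assumptions for illustration, not confirmed specs:

```python
# Back-of-the-envelope memory bandwidth: DDR-type memory transfers twice
# per clock, and a 128-bit bus moves 16 bytes per transfer.
def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int = 128) -> float:
    return mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000

print(bandwidth_gb_s(445.5))  # MBP MR X1600 at 445.5MHz -> ~14.3 GB/s
print(bandwidth_gb_s(550))    # Go 7700 (assumed 550MHz) -> ~17.6 GB/s
# Twelve pipelines sharing ~17.6 GB/s get far less bandwidth per pipeline
# than four pipelines sharing ~14.3 GB/s, hence the "starved" pipelines.
```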

As for the lifespan of the MR X1600: it has been around for a while, but only because it came out early compared to the competition. The MR X1600 launched in December 2005, and Apple was one of the first manufacturers to integrate it. The Go 7600 wasn't launched until a few months later, which is why it seems newer.

The whole mobile graphics segment has been pretty stagnant for the last year anyway, since everyone is devoting resources to the DX10 transition. ATI in particular doesn't seem to be paying much attention, seeing as the MR X1700 was such a disappointment. Their original schedule called for DX10 GPUs in Q4 2006, so they probably weren't planning a mobile refresh anyway; with the constant delays, they were forced to put something out, which became the MR X1700. ATI looks to be slipping even further, since the R600 is now confirmed to be delayed until May. It also doesn't help that ATI is trying to produce their mainstream parts on the brand-new 65nm process. The new process will make for great parts, but it's a matter of how long we'll have to wait. nVidia is going the opposite direction and making their mainstream parts on the 80nm process. With the 80nm process being a cost half-node, nVidia's mainstream mobile parts could well be power hungry and run hot, especially if they try putting a 256-bit memory controller in them (like the desktop 8600 Ultra is supposed to have). The advantage, though, is that they should be able to launch a few months ahead of ATI, though probably still not until early Q2.
 

whateverandever

macrumors 6502a
Nov 8, 2006
778
8
Baltimore
It isn't. The X1600 in the MBP is underclocked, but not that much (stock X1600 clock speeds are 470/470; the current MBP is clocked at 418.50/445.50).

Core Duo machines have their X1600 clocked at 310/278, which falls well short of the 470/470 recommended stock speeds. While it's true that you can reboot and overclock it back up in Windows, that still makes for poor gaming performance in Mac OS :| Not to mention that I've only had it stably overclocked to 430/430 in Windows... it might just be my machine, but I can't even reach 470/470.
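
For perspective, here's a quick sketch of how far those clocks fall short of 470/470 (just arithmetic on the numbers quoted in this thread):

```python
# Core/memory clocks (MHz) as a percentage of the 470/470 stock speeds.
stock_core, stock_mem = 470, 470
configs = {
    "Core Duo default": (310, 278),
    "My stable overclock": (430, 430),
    "C2D MBP default": (418.5, 445.5),
}
for name, (core, mem) in configs.items():
    print(f"{name}: core {core / stock_core:.0%}, memory {mem / stock_mem:.0%}")
# Core Duo default: core 66%, memory 59% -- a much deeper underclock
# than the C2D's 89%/95%.
```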
 

lavrishevo

macrumors 68000
Jan 9, 2007
1,864
204
NJ
Core Duo machines have their X1600 clocked at 310/278, which falls well short of the 470/470 recommended stock speeds. While it's true that you can reboot and overclock it back up in Windows, that still makes for poor gaming performance in Mac OS :| Not to mention that I've only had it stably overclocked to 430/430 in Windows... it might just be my machine, but I can't even reach 470/470.

Yeah, I know what you mean. I can run at about 450/475 with no problems, but much higher and there are issues. If anything, I think this is due to the out-of-date video card driver. I don't know why they haven't released a new one.

DW
 

commander.data

macrumors 65816
Nov 10, 2006
1,058
187
Core Duo machines have their X1600 clocked at 310/278, which falls well short of the 470/470 recommended stock speeds. While it's true that you can reboot and overclock it back up in Windows, that still makes for poor gaming performance in Mac OS :| Not to mention that I've only had it stably overclocked to 430/430 in Windows... it might just be my machine, but I can't even reach 470/470.
I don't think ATI actually has recommended clock speeds for their mobile GPUs. Notebook manufacturers set the speeds according to the available thermal envelope and targeted battery life. I think the 470/470 figure may be the maximum clock speeds ATI allows notebook manufacturers to use without voiding the support policy. A lot of notebooks pair the MR X1600 with DDR2 memory instead of GDDR3, which limits the memory to 400MHz; they clock the core near 470MHz to compensate, but that isn't very effective since the core ends up bandwidth starved anyway.

Apple seems to be using good GDDR3, and from the pictures I've seen it looks to be Samsung memory rated at up to 700MHz, like that used for the desktop X1950 Pro. I guess they feel 700MHz memory downclocked to 450MHz is more energy and heat efficient than 500MHz memory running at 450MHz, which would be the more standard practice. The MR X1600 core may not be the highest clocked at 425MHz, but with the memory clocked high relative to the core at 450MHz, the core is less likely to be memory bandwidth starved and works as efficiently as possible.
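
As a rough illustration of that last point, here's a small sketch comparing "bandwidth available per core clock" for Apple's setup versus the more typical DDR2 arrangement (the 128-bit bus is an assumption on my part):

```python
# Bytes of memory bandwidth available per GPU core clock. DDR-type memory
# transfers twice per clock; a 128-bit bus moves 16 bytes per transfer.
def bytes_per_core_clock(mem_mhz: float, core_mhz: float, bus_bits: int = 128) -> float:
    return mem_mhz * 2 * (bus_bits / 8) / core_mhz

print(bytes_per_core_clock(450, 425))  # Apple: GDDR3 at 450MHz, core 425MHz -> ~34
print(bytes_per_core_clock(400, 470))  # Typical: DDR2 at 400MHz, core 470MHz -> ~27
# The slower core paired with faster memory gets noticeably more bandwidth
# per clock, so it is less likely to stall waiting on memory.
```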
 

Abstract

macrumors Penryn
Dec 27, 2002
24,889
921
Location Location Location
I don't think ATI actually has recommended clock speeds for their mobile GPUs. Notebook manufacturers set the speeds according to the available thermal envelope and targeted battery life. I think the 470/470 figure may be the maximum clock speeds ATI allows notebook manufacturers to use without voiding the support policy. A lot of notebooks pair the MR X1600 with DDR2 memory instead of GDDR3, which limits the memory to 400MHz; they clock the core near 470MHz to compensate, but that isn't very effective since the core ends up bandwidth starved anyway.

Apple seems to be using good GDDR3, and from the pictures I've seen it looks to be Samsung memory rated at up to 700MHz, like that used for the desktop X1950 Pro. I guess they feel 700MHz memory downclocked to 450MHz is more energy and heat efficient than 500MHz memory running at 450MHz, which would be the more standard practice. The MR X1600 core may not be the highest clocked at 425MHz, but with the memory clocked high relative to the core at 450MHz, the core is less likely to be memory bandwidth starved and works as efficiently as possible.

Thanks. You've been really informative with your posts so far. :)
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
Core Duo machines have their X1600 clocked at 310/278, which falls extremely short of the 470/470 recommended stock speeds. While it's true that you can reboot and overclock it back up in Windows, it makes for poor gaming performance in Mac OS :| Not to mention that I've only really had it overclocked stably to 430/430 in Windows... might just be my machine, but I can't even reach 470/470.

I'm talking about Core 2 Duo machines. Apple did improve the clock speeds on those to what I said before (418/445), which is actually quite good (Apple isn't the only company that underclocks its mobile GPUs to some degree or another).

I have to think a large part of this is the improved vents on the back of the C2D MBPs.
 

whateverandever

macrumors 6502a
Nov 8, 2006
778
8
Baltimore
I'm talking about Core 2 Duo machines. Apple did improve the clock speeds on those to what I said before (418/445), which is actually quite good (Apple isn't the only company that underclocks its mobile GPUs to some degree or another).

I have to think a large part of this is the improved vents on the back of the C2D MBPs.

Weird, that's the first I've heard of improved vents on the back of C2D machines. Where did you hear that?
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
Weird, that's the first I've heard of improved vents on the back of C2D machines. Where did you hear that?

I didn't "hear" it from anywhere. It's been known since the C2D MBP's came out. The vents on the back of the C2D MBP are much wider (basically, like 6 long vents instead of a bunch of tiny ones). You can see this if you look at the back of a CD MBP and then look at the back of a C2D MBP.

I recall people here had taken some comparison photos too when the C2D MBP's launched.

-Zadillo
 

whateverandever

macrumors 6502a
Nov 8, 2006
778
8
Baltimore
I didn't "hear" it from anywhere. It's been known since the C2D MBP's came out. The vents on the back of the C2D MBP are much wider (basically, like 6 long vents instead of a bunch of tiny ones). You can see this if you look at the back of a CD MBP and then look at the back of a C2D MBP.

I recall people here had taken some comparison photos too when the C2D MBP's launched.

-Zadillo

That's interesting. I couldn't find any pics, but it's nice to know. If I ever have a windfall of cash it may be worth it to upgrade to the better-clocked video card, haha.
 

sycho

macrumors 6502a
Oct 7, 2006
865
4
I really wouldn't be surprised if they did, especially now with ATI being owned by AMD.

Yeah, because that will make a difference in Apple's choice of GPU.
Think about it: did Microsoft not use G5s for the alpha development kit for the 360?
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
Yeah, because that will make a difference in Apple's choice of GPU.
Think about it: did Microsoft not use G5s for the alpha development kit for the 360?

No, but it's not necessarily the same scenario. I'm not saying it's a reason to stop using ATI, but with ATI now part of AMD, Apple might not want to give extra business to Intel's rival. Of course, they could just as easily keep doing business with ATI as a way to keep Intel on its toes and make sure they maintain the relationship.

Microsoft used G5s for the alpha development kit for the 360 because the final Xbox 360 uses PowerPC chips. There isn't much more to read into it, really.
 

sycho

macrumors 6502a
Oct 7, 2006
865
4
Microsoft used G5s for the alpha development kit for the 360 because the final Xbox 360 uses PowerPC chips. There isn't much more to read into it, really.

To say that Apple should use nVidia because AMD owns ATI makes sense, though... :rolleyes:
What about every other laptop with an Intel processor? Should those use nVidia cards instead of ATI's? Get real: Apple can use whichever GPU they want.
 

bob marley

macrumors newbie
Feb 26, 2007
23
0
Seriously, it's nice to put a couple of games on normal settings and play for the fun of it on, say, a 128MB graphics card that can handle most games at moderate settings. But SERIOUSLY, do you really need dual 768MB CrossFire ATI video cards in your computer, or nVidia SLI with 1GB of graphics memory in your notebook, when 512MB or 256MB is good enough for games at high settings?
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
To say that Apple should use nVidia because AMD owns ATI makes sense, though... :rolleyes:
What about every other laptop with an Intel processor? Should those use nVidia cards instead of ATI's? Get real: Apple can use whichever GPU they want.

Calm down. All I said was that it wouldn't surprise me if they did switch to using primarily nVidia GPUs. I didn't say that they should or would.

Aside from that, though, there may be more practical reasons for Apple to use more nVidia stuff (e.g. if nVidia makes more advances on the mobile GPU front than ATI, which does look to be the case right now, although we'll know more in the next couple of months).

-Zadillo
 

Zadillo

macrumors 68000
Jan 29, 2005
1,546
49
Baltimore, MD
Apple should add the ATI Radeon X1800 XT to their next MacBook Pro and the nVidia 6600 to the MacBook.

It would be impossible to put the X1800XT into a 1" thick MBP.

The only hope is that the next generation of mobile GPUs is more powerful yet more efficient.

Anything beyond integrated graphics at the MacBook's price point is probably wishful thinking at this point. At best maybe something low-end.
 

eXan

macrumors 601
Jan 10, 2005
4,738
134
Russia
Seriously, it's nice to put a couple of games on normal settings and play for the fun of it on, say, a 128MB graphics card that can handle most games at moderate settings. But SERIOUSLY, do you really need dual 768MB CrossFire ATI video cards in your computer, or nVidia SLI with 1GB of graphics memory in your notebook, when 512MB or 256MB is good enough for games at high settings?

The VRAM is the last thing to worry about in a graphics card.

It's the core/VRAM clock speeds, the number of pixel pipes, the vertex shader engines, the memory bandwidth, etc. that matter.

I find it strange when people compare graphics cards' performance by the amount of memory those cards have :rolleyes:
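
A toy comparison makes the point; both "cards" here are completely made up for illustration:

```python
# Two hypothetical cards: one sells on VRAM size, one on the specs that
# actually move frame rates. All numbers are invented for illustration.
cards = {
    "512MB card": {"vram_mb": 512, "core_mhz": 350, "pixel_pipes": 4, "mem_bw_gb_s": 10.0},
    "256MB card": {"vram_mb": 256, "core_mhz": 560, "pixel_pipes": 12, "mem_bw_gb_s": 22.0},
}
for name, c in cards.items():
    fill_gpix = c["core_mhz"] * c["pixel_pipes"] / 1000  # crude fill rate
    print(f"{name}: ~{fill_gpix:.1f} Gpixel/s fill, {c['mem_bw_gb_s']} GB/s bandwidth")
# The 256MB card wins on fill rate and bandwidth despite having half the
# VRAM -- which is why memory size alone tells you very little.
```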
 