32-bit

I ran around Oblivion for a bit... not the highest settings and no AA, but close to max settings with HDR.

Forest - noticeably choppy while on a horse and while walking, but playable
Outdoors - a little choppy while on a horse
City - no lag

Weeeeee! The low-end model, you say? With high settings and HDR? At native resolution? You, kind sir (or madam), just made my day! :eek: :D :p :cool: :) :apple:

Would it be nice and smooth if you turned down the settings a bit? Is HDR or 2xAA more demanding?
 
With previous MacBook Pros, the X1600 was underclocked from its intended clock speed. Has anybody with the new MBPs had a chance to check the clock speed of the 8600M GTs? Also, is there a difference in clock speed between the 128MB and 256MB versions?

I am sure the graphics card in any Apple computer works exactly at its intended clock speed. And usually the clock speed is intended to be low to save power when there isn't much work to do, and higher when the card is busy.
 
Weeeeee! The low-end model, you say? With high settings and HDR? At native resolution? You, kind sir (or madam), just made my day! :eek: :D :p :cool: :) :apple:

Would it be nice and smooth if you turned down the settings a bit? Is HDR or 2xAA more demanding?

Yeah... if that's on the model with only 128MB of VRAM, that's definitely nice.

Would love to hear if you get a chance to play around with the settings and find what does give you smooth performance... I imagine it's possible if the settings and resolution are lowered a bit - and doesn't HDR also knock performance down quite a bit, even on the highest-end cards?
 
I am sure the graphics card in any Apple computer works exactly at its intended clock speed. And usually the clock speed is intended to be low to save power when there isn't much work to do, and higher when the card is busy.

The X1600 in the first-generation MacBook Pros was known to be underclocked compared to the second-generation ones (the MacBook Pros with Core 2 Duos), at a level significantly lower than the reference clock. But of course, all modern mobile cards implement power-saving strategies such as voltage gating, etc.
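
For anyone wondering why downclocking helps so much: dynamic power in a chip scales roughly with capacitance x voltage^2 x frequency, so dropping the clock together with even a small voltage step saves a disproportionate amount of power. A rough sketch (the voltages below are purely illustrative guesses, not actual 8600M GT operating points):

```python
# Back-of-the-envelope: why downclocking (plus a small voltage drop) saves so much power.
# Dynamic CMOS power scales roughly with C * V^2 * f. The voltages below are purely
# illustrative guesses, not actual 8600M GT operating points.

def relative_dynamic_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Dynamic power relative to a reference clock/voltage point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Reference point: the card at its full 475MHz core clock.
print(relative_dynamic_power(375, 0.95, 475, 1.05))  # ~0.65 -> roughly a third less dynamic power
print(relative_dynamic_power(200, 0.85, 475, 1.05))  # ~0.28 -> under a third of full power
```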
 
Yeah... if that's on the model with only 128MB of VRAM, that's definitely nice.

Would love to hear if you get a chance to play around with the settings and find what does give you smooth performance... I imagine it's possible if the settings and resolution are lowered a bit - and doesn't HDR also knock performance down quite a bit, even on the highest-end cards?
From experience, HDR + AA is a killer. Most of the highest-end cards (8800GTX/HD2900) scoff at the pair, though. AA hurts more if you don't have enough memory bandwidth for it. HDR is really computationally expensive, more so than AA. These "D3D10" cards are designed to handle it much better than the "D3D9" cards are.
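
To put rough numbers on why HDR + AA hurts so much on a 128MB card, here's a simplified framebuffer estimate (color plus Z/stencil only, no textures and no compression, assuming the MBP's native 1440x900):

```python
# Simplified framebuffer estimate at the MBP's native 1440x900: color buffer plus
# Z/stencil only, no textures and no compression, so real numbers will differ.

def framebuffer_mb(width, height, bytes_per_color_pixel, msaa_samples):
    color = width * height * bytes_per_color_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 24-bit Z + 8-bit stencil per sample
    return (color + depth) / (1024 ** 2)

w, h = 1440, 900
print(framebuffer_mb(w, h, 4, 1))  # ~9.9MB   - plain 32-bit rendering, no AA
print(framebuffer_mb(w, h, 8, 1))  # ~14.8MB  - FP16 HDR, no AA
print(framebuffer_mb(w, h, 8, 4))  # ~59.3MB  - FP16 HDR + 4xAA: nearly half of 128MB gone
```

And that's before any textures, which at high settings will want most of what's left.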
 
From experience, HDR + AA is a killer. Most of the highest-end cards (8800GTX/HD2900) scoff at the pair, though. AA hurts more if you don't have enough memory bandwidth for it. HDR is really computationally expensive, more so than AA. These "D3D10" cards are designed to handle it much better than the "D3D9" cards are.

So then will the 8600M GT, which AFAIK is a DX10 card, get that design improvement that allows it to handle HDR more easily? Will this still be in effect if you're playing a DX9 game like Oblivion?
 
Another thing: under Vista, at least, this GPU shares video RAM and regular RAM. It says in the GPU settings that I have 887MB of total graphics memory, with 128MB of it dedicated. I assume it manages this number automatically.
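
If you're curious where a number like 887MB could come from: Vista adds the dedicated VRAM to a chunk of system RAM it's allowed to share with the GPU. A commonly cited rule of thumb is roughly half of the RAM above a small reserve; the exact formula here is my assumption, so treat this as a ballpark sketch:

```python
# Ballpark sketch of Vista's "total graphics memory" figure. Assumption: the GPU can
# share roughly half of system RAM above a small reserve - the exact formula is a
# guess for illustration, not something confirmed in this thread.

def total_graphics_memory_mb(dedicated_vram_mb, system_ram_mb, reserve_mb=512):
    shared = max(0, (system_ram_mb - reserve_mb) // 2)
    return dedicated_vram_mb + shared

# 128MB card in a 2GB machine: 128 + (2048 - 512) / 2 = 896MB, in the same ballpark
# as the reported 887MB (Vista keeps a little RAM for itself, which narrows the gap).
print(total_graphics_memory_mb(128, 2048))  # 896
```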
 
Another thing: under Vista, at least, this GPU shares video RAM and regular RAM. It says in the GPU settings that I have 887MB of total graphics memory, with 128MB of it dedicated. I assume it manages this number automatically.

Does XP have that ability, and if it doesn't, will that significantly affect performance when running XP?
 
I just installed the latest Asus drivers (straight from their website, for their 8600M GT notebook model) using a modded nv_disp.inf file, and according to nTune, the core clock speed is 375MHz and the memory is 502MHz (1004MHz effective).

That's 100MHz below NVIDIA's spec of a 475MHz core clock.

I also ran the World in Conflict beta, which is pretty graphically demanding. At 1440x900 it ran smooth as silk on medium settings and was a little bit choppy on very high. Once they enable DX10 in the beta, the rendering should actually be a little bit quicker (according to the devs).

This was on the 2.4GHz model.
 
Does XP have that ability, and if it doesn't, will that significantly affect performance when running XP?

I think that capability is automatically built into the video card or its drivers. I mean, previous, pre-Vista graphics cards have had that ability too.

And based on NVIDIA's own presentations, its 8x00M cards are faster in 3DMark than in real life: 2x the performance in 3DMark05 equals 1.3x to 1.6x in real life, according to them. The 8600M GT is very good, but going by 3DMark alone, it's not as good as the score would lead us to believe. :(

Edit:

According to nTune, the core clock speed is 375MHz and the memory is 502MHz (1004MHz effective).

Official numbers are 475MHz core and 700MHz (1400MHz effective) memory. I'm somewhat disappointed, but it's nothing overclocking can't fix :)
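
Quick back-of-the-envelope on what those clocks mean in practice (the 128-bit memory bus is the standard 8600M GT spec, assumed rather than measured here):

```python
# What the nTune readings mean relative to the reference clocks quoted above.
# The 128-bit memory bus is the standard 8600M GT spec (assumed, not measured here).

reference = {"core_mhz": 475, "mem_mhz_effective": 1400}
measured = {"core_mhz": 375, "mem_mhz_effective": 1004}

def bandwidth_gb_s(mem_mhz_effective, bus_width_bits=128):
    # effective transfer rate * bus width in bytes, reported in GB/s
    return mem_mhz_effective * 1e6 * (bus_width_bits / 8) / 1e9

print(measured["core_mhz"] / reference["core_mhz"])    # ~0.79 -> core clock about 21% down
print(bandwidth_gb_s(measured["mem_mhz_effective"]))   # ~16.1 GB/s as shipped
print(bandwidth_gb_s(reference["mem_mhz_effective"]))  # ~22.4 GB/s at reference -> ~28% less bandwidth
```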
 
So then will the 8600M GT, which AFAIK is a DX10 card, get that design improvement that allows it to handle HDR more easily? Will this still be in effect if you're playing a DX9 game like Oblivion?

Use the Medium settings found here for a fantastic experience.


Another thing: under Vista, at least, this GPU shares video RAM and regular RAM. It says in the GPU settings that I have 887MB of total graphics memory, with 128MB of it dedicated. I assume it manages this number automatically.
Yes.
Does XP have that ability, and if it doesn't, will that significantly affect performance when running XP?
Yes, XP can do TurboCache.
 
Official numbers are 475MHz core and 700MHz (1400MHz effective) memory. I'm somewhat disappointed, but it's nothing overclocking can't fix :)


...sort of. As of right now, nTune won't accept any overclocking settings for this card; it just comes back saying that the GPU core failed at that speed. It's definitely a driver issue, though, as it gives that message at any setting.
 
Found this tidbit on Tom's Hardware:

Newer game engines like the one in Oblivion need large amounts of frame buffer memory; when there isn't enough, performance can suffer. For the cards that have 320 MB of memory or less, the game crashes and returns to the Windows Desktop. The game uses many graphics techniques that demand hardware that can handle it. At the highest resolution, neither the 8800GTS 320 nor the 8600 based cards can render the game. However, if you could afford the 30" LCD monitor to play at that resolution, you would most likely have the money to buy a pair of high end cards to drive it in SLI.
Link
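
A rough sense of why the highest resolution is so brutal - using the same simplified render-target estimate as earlier (color plus Z only, worst case FP16 HDR + 4xAA, no textures or compression):

```python
# Same simplified render-target estimate as earlier, now comparing resolutions.
# Worst case assumed: FP16 HDR color + 4xAA, Z/stencil included, textures ignored.

def render_targets_mb(width, height, bytes_per_color_pixel=8, msaa_samples=4):
    color = width * height * bytes_per_color_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples
    return (color + depth) / (1024 ** 2)

print(2560 * 1600 / (1440 * 900))     # ~3.16x the pixels of the MBP's 1440x900 panel
print(render_targets_mb(1440, 900))   # ~59MB of render targets
print(render_targets_mb(2560, 1600))  # ~187MB - before a single texture is loaded
```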
 
I just installed the latest Asus drivers (straight from their website, for their 8600M GT notebook model) using a modded nv_disp.inf file, and according to nTune, the core clock speed is 375MHz and the memory is 502MHz (1004MHz effective).

That's 100MHz below NVIDIA's spec of a 475MHz core clock.

Their drivers could be setting that clock. Until we can get good (read: official) drivers, it may be hard to pin down what the true clock rate is.
 
He also might have 256MB of graphics memory instead of 128.

Yeah, the Asus G1S has the GeForce 8600M GT with 256MB VRAM.

Also, given how much thicker and heavier the G1S is, I bet they didn't need to underclock the GPU as much as in the MBP, which could contribute to that difference.

But frankly, only a 500-point difference between the base-model MBP and the G1S is really nice, IMO.

I like the G1S, but it is too heavy (6.8 pounds) and too "big" overall for me. Getting somewhat close performance in a 5.4-pound, 1"-thick MBP is fantastic, IMO.

-Zadillo
 
Their drivers could be setting that clock. Until we can get good (read: official) drivers, it may be hard to pin down what the true clock rate is.

I don't think drivers can modify clock speeds. Default clock speeds are set in the BIOS of the video card.
 