Another thing: under Vista, at least, this GPU shares video RAM and regular RAM. The GPU settings say I have 887MB of total graphics memory, with 128MB of it dedicated. I assume it manages this number automatically.
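For what it's worth, here's the split implied by those numbers (just back-of-the-envelope; Vista's actual shared-memory accounting is more involved than a simple subtraction):

```python
# Back-of-the-envelope split of the numbers reported above; Vista's real
# shared-memory policy is more involved than a simple subtraction.
total_graphics_mb = 887
dedicated_mb = 128
shared_from_system_mb = total_graphics_mb - dedicated_mb
print(f"{shared_from_system_mb} MB borrowed from system RAM")  # 759 MB
```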

Makes sense... I recall that Asus, I think, advertised the 8600M GT in their G1S as having 256MB of VRAM, up to 1024MB (including shared memory).

I've read some vague reports that video memory sharing is supposed to be improved overall in the 8M series.
 
I am sure the graphics card in any Apple computer works exactly at its intended clock speed. And usually the clock speed is intended to be low to save power when there isn't much work to do, and higher when the card is busy.

You're quite right. The GPUs aren't underclocked, but run to Apple's specs. If you increase the speed to match some other manufacturer, you are exceeding the specs and therefore overclocking.
 
You're quite right. The GPUs aren't underclocked, but run to Apple's specs. If you increase the speed to match some other manufacturer, you are exceeding the specs and therefore overclocking.

But Nvidia publishes reference levels that say what speeds their video card is supposed to run at. If the 8600M GT in the MBP has a clock speed lower than these levels, it is underclocked, presumably because of heat issues. Conversely, if the clock speed is higher than these reference levels, then it is overclocked, and some video card manufacturers such as BFG, eVGA, etc. actually sell their cards this way.
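In other words, under- or overclocked is always judged against the chip maker's reference clocks, not the board vendor's. A toy sketch, using the 8600M GT reference clocks cited in this thread:

```python
# Toy illustration: a card counts as under- or overclocked relative to the
# GPU maker's reference clock, regardless of which vendor ships the board.
# 475/700 MHz are the 8600M GT reference clocks cited in this thread.
REFERENCE_MHZ = {"core": 475, "memory": 700}

def classify(actual_mhz: int, reference_mhz: int) -> str:
    if actual_mhz < reference_mhz:
        return "underclocked"
    if actual_mhz > reference_mhz:
        return "overclocked"
    return "at reference"

print(classify(375, REFERENCE_MHZ["core"]))  # MBP reading -> underclocked
print(classify(540, REFERENCE_MHZ["core"]))  # factory OC  -> overclocked
```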
 
I don't think drivers can modify clock speeds. Default clock speeds are set in the BIOS of the video card.

Hmm, but if these cards don't have a BIOS (I thought they were flashed with EFI ROMs), then how is the reference level set? If the cards have a BIOS, you should be able to read it and back it up with a flashing program. I will check and see if nvflash works on the 8x00 series cards.
 
Okay, yes, if the card has a BIOS ROM it can be flashed. Using NiBiTor and nvflash v5.50, you can have an 8600M GT from any other manufacturer. Or you can read your settings and go from there. But this would only work if there was a BIOS on the card. There was a GUI tool I used to set a permanent overclock on my 6800, but I don't remember what it was called. And I am pretty sure I used nvflash to upload it anyway. I imagine it may not work, though, as I am not sure how the EFI portion is allocated versus the BIOS.
 
Okay, yes, if the card has a BIOS ROM it can be flashed. Using NiBiTor and nvflash v5.50, you can have an 8600M GT from any other manufacturer. Or you can read your settings and go from there. But this would only work if there was a BIOS on the card. There was a GUI tool I used to set a permanent overclock on my 6800, but I don't remember what it was called. And I am pretty sure I used nvflash to upload it anyway. I imagine it may not work, though, as I am not sure how the EFI portion is allocated versus the BIOS.
RivaTuner?
 
Okay, yes, if the card has a BIOS ROM it can be flashed. Using NiBiTor and nvflash v5.50, you can have an 8600M GT from any other manufacturer. Or you can read your settings and go from there. But this would only work if there was a BIOS on the card. There was a GUI tool I used to set a permanent overclock on my 6800, but I don't remember what it was called. And I am pretty sure I used nvflash to upload it anyway. I imagine it may not work, though, as I am not sure how the EFI portion is allocated versus the BIOS.
You should be able to flash the video BIOS as long as you can boot into Windows. Then again, whether the card will still work in OS X is a whole different story.
 
I just installed the latest Asus drivers (straight from their website, for their 8600M GT notebook model) using a modded nv_disp.inf file, and according to nTune, the core clock speed is 375MHz and the memory is 502MHz (1004MHz effective).

That's 100MHz below Nvidia's spec of a 475MHz core clock.

I also ran the World in Conflict beta, which is pretty graphically demanding. At 1440x900 it ran smooth as silk on medium settings and a little bit choppy on very high. Once they enable DX10 in the beta, the rendering should actually be a little quicker (according to the devs).

This was on the 2.4GHz model.
I doubt the memory is 1004MHz, since normal is 700... so it's more likely 502 vs. 700 :( and there's no chance Apple OVERCLOCKED it.
The normal core clock is 475MHz as you say, so probably a 100MHz core underclock and a 200 (198) MHz memory underclock.
That sucks.

How to overclock it in OS X?

Edit: OK, maybe 700*2.
 
I don't think drivers can modify clock speeds. Default clock speeds are set in the BIOS of the video card.
You have always been able to turn on a bit in the registry which lets Nvidia's own drivers set the clock speed, so yes, it can be done with drivers. RivaTuner can do it instantly as well, though if you want to keep the clocks permanently it's easier to just flash them into the firmware.
 
I doubt the memory is 1004MHz, since normal is 700... so it's more likely 502 vs. 700 :( and there's no chance Apple OVERCLOCKED it.
The normal core clock is 475MHz as you say, so probably a 100MHz core underclock and a 200 (198) MHz memory underclock.
That sucks.

How to overclock it in OS X?

Since it's DDR (Double Data Rate) memory (GDDR3, to be specific), 1000MHz is the equivalent frequency if it were SDRAM. So it's 500MHz x 2.
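Worked out with the 8600M GT's 128-bit bus (that width is the published spec, not something I measured):

```python
# GDDR3 transfers data on both clock edges, so the "effective" rate is
# twice the real clock. 502 MHz is the nTune reading from this thread;
# the 128-bit bus width is the 8600M GT's published spec.
real_clock_mhz = 502
effective_mhz = real_clock_mhz * 2            # the 1004 "MHz" nTune shows
bus_width_bits = 128
bandwidth_gb_s = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{effective_mhz} MT/s effective, ~{bandwidth_gb_s:.1f} GB/s")  # ~16.1 GB/s
```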
 
You have always been able to turn on a bit in the registry which lets Nvidia's own drivers set the clock speed, so yes, it can be done with drivers. RivaTuner can do it instantly as well, though if you want to keep the clocks permanently it's easier to just flash them into the firmware.

You're referring to Coolbits, and it has to be enabled in the registry manually. Drivers installed on a clean Windows install will not modify the frequency.
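A sketch of what "enabled in the registry manually" amounts to, assuming the classic Coolbits location (HKLM\SOFTWARE\NVIDIA Corporation\Global\NVTweak, DWORD "CoolBits"); the exact key and value varied across driver generations:

```python
# Minimal sketch, assuming the classic Coolbits registry location; the
# exact key path and value differ between driver generations.
# Windows-only; needs to run elevated.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # A value of 3 historically exposed the clock sliders in the driver panel.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)
```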
 
You're referring to Coolbits, and it has to be enabled in the registry manually. Drivers installed on a clean Windows install will not modify the frequency.

If the drivers think the card is something it isn't, they can; that is how people used to make their 9500 Pros full-fledged 9700 Pros back in the day. That is also why we have to do .inf hacks to get the driver to recognize the card.
 
RivaTuner?

No, I actually flashed my BIOS with a GUI-based tool, or at least changed the settings, saved it, then used nvflash to upload it. You can put my card (currently used in a server) in any computer and it runs at 399/829 permanently. I keep meaning to replace the card in that system, but am kinda lazy (hey, it works, leave me alone...).

Although RivaTuner should be able to tell what the card really is, regardless of what the driver or Windows thinks.
 
If the drivers think the card is something it isn't, they can; that is how people used to make their 9500 Pros full-fledged 9700 Pros back in the day. That is also why we have to do .inf hacks to get the driver to recognize the card.

That was a very special situation where a soft-hack was used because some 9500 Non Pros were actually 9700s with half the pipelines and memory bus disabled (but not removed), so in that special case it worked. By the way, this hack did not work with the 9500 PRO. These kinds of hacks don't usually work anymore for newer cards, since they got smarter about this kind of thing.
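For a sense of why that soft-hack was worth the trouble, a rough fillrate sketch (the clock is illustrative, not an exact retail speed):

```python
# Rough sketch: re-enabling the disabled half of the R300's pixel pipelines
# doubles theoretical fillrate at the same core clock. 275 MHz is
# illustrative, not an exact retail clock.
core_mhz = 275
for name, pipes in [("locked (4 pipes)", 4), ("unlocked (8 pipes)", 8)]:
    print(f"{name}: {pipes * core_mhz} Mpix/s theoretical fillrate")
```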
 
You're referring to Coolbits, and it has to be enabled in the registry manually. Drivers installed on a clean Windows install will not modify the frequency.
Still, obviously there's no problem doing it with drivers.
 
No, I actually flashed my BIOS with a GUI-based tool, or at least changed the settings, saved it, then used nvflash to upload it. You can put my card (currently used in a server) in any computer and it runs at 399/829 permanently. I keep meaning to replace the card in that system, but am kinda lazy (hey, it works, leave me alone...).

Although RivaTuner should be able to tell what the card really is, regardless of what the driver or Windows thinks.
Yeah, but you use RivaTuner to download your firmware, adjust it, save it, and then upload it to the card with nvflash again.
 
Now that we have an updated Boot Camp...

...can someone confirm that the clock speeds remain the same?

I guess no one has tried changing them either, in RivaTuner without rebooting, or through a flash of the graphics card?
 
Now that I've got the proper drivers installed with Boot Camp 1.3, I can get a better read on the clock speeds.

Looks like the 8600M GT is running at:

2D:
283 MHz core
297 MHz memory

3D:
470 MHz core
635 MHz memory

So it is running about 1% slower on the core and about 10% slower on the memory.
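Checking my percentages against the reference clocks mentioned earlier (475MHz core, 700MHz memory):

```python
# Checking the "about 1% / about 10%" figures against Nvidia's reference
# clocks for the 8600M GT (475 MHz core, 700 MHz memory, per earlier posts).
readings_3d_mhz = {"core": 470, "memory": 635}
reference_mhz = {"core": 475, "memory": 700}
for part, actual in readings_3d_mhz.items():
    deficit = 1 - actual / reference_mhz[part]
    print(f"{part}: {deficit:.1%} below reference")
# core: 1.1% below reference
# memory: 9.3% below reference
```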
 
Now that I've got the proper drivers installed with Boot Camp 1.3, I can get a better read on the clock speeds.

Looks like the 8600M GT is running at:

2D:
283 MHz core
297 MHz memory

3D:
470 MHz core
635 MHz memory

So it is running about 1% slower on the core and about 10% slower on the memory.
Is that during some sort of load? Because someone made a new thread with those 375/502 settings again. Maybe he used the Asus drivers again, but if not, maybe the GPU is changing its frequency and RivaTuner is reading the current speed? So maybe it would run at full speed while running 3DMark06 and throttle down when idling.

Do you get the same speeds doing nothing vs. putting it under load?
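To illustrate what I mean, a toy model (the clocks are the 2D/3D readings quoted above, not measurements of mine):

```python
# Toy model of the throttling hypothesis: if the GPU drops to a 2D clock at
# idle and ramps to a 3D clock under load, a tool reading the *current*
# clock reports different numbers depending on when you sample it.
# 283/470 MHz are the 2D/3D core readings quoted above; the 20% load
# threshold is made up for illustration.
def current_core_clock_mhz(gpu_load_pct: float) -> int:
    IDLE_2D_MHZ, LOAD_3D_MHZ = 283, 470
    return LOAD_3D_MHZ if gpu_load_pct > 20 else IDLE_2D_MHZ

print(current_core_clock_mhz(0))   # idling on the desktop -> 283
print(current_core_clock_mhz(95))  # running 3DMark06      -> 470
```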
 
That was a very special situation where a soft-hack was used because some 9500 Non Pros were actually 9700s with half the pipelines and memory bus disabled (but not removed), so in that special case it worked. By the way, this hack did not work with the 9500 PRO. These kinds of hacks don't usually work anymore for newer cards, since they got smarter about this kind of thing.

That was a shame, wasn't it. I have a 9500 Pro, probably one of the best cards I ever did buy. Full R300 processor in that bad boy.

They made some 9500 NPs where they disabled half the R300 but bizarrely put the chip on a 9700 PCB, so it had the 256-bit memory bus. All you needed to do was unlock the disabled half and voila, a 9700 non-Pro... use ATITool or whatever and voila, 9700 PRO!

You could tell by the RAM arrangement: if the RAM was in an L-shaped arrangement it was 256-bit; if it ran straight down the outside edge in an I-shape it was 128-bit. :( The 9500 Pros all came with I-shaped RAM. :(

You could also do the same with X800s and 6800s.

So many people I know bought X800 GTOs or X800 Pros and ended up with X800 XT PEs! ...fun times. A couple of people did it with the 6800 NU and lesser cards and ended up with 6800 GT/Ultras.

It was all luck of the draw, but in the end Sapphire was selling the X800 GTO² either unlocked or guaranteed to unlock.

Those were the days.
 
Now that I've got the proper drivers installed with Boot Camp 1.3, I can get a better read on the clock speeds.

Looks like the 8600M GT is running at:

2D:
283 MHz core
297 MHz memory

3D:
470 MHz core
635 MHz memory

So it is running about 1% slower on the core and about 10% slower on the memory.

Hmm. I installed the Boot Camp 1.3 drivers. I'm using nTune from Nvidia to check the clock speeds; it displays them dynamically under Windows XP Pro. I'm getting a 375MHz core clock and a 502MHz memory clock (x2), no matter whether I'm running 2D or 3D apps.

I know the previous model of MBP would upclock the core of the Mobility Radeon X1600 from 311MHz to 472MHz when I ran Doom 3 or Quake 4. I was hoping the GeForce 8600M GT does the same thing, at least under OS X. But there is no way to check it under OS X at this point, since apps like ATIccelerator and Graphiccelerator won't work with it.

Whether that's true or not, the 8600M is running Quake 4 60% faster than the X1600 at 1280x800 High.
 
Hmm. I installed the Boot Camp 1.3 drivers. I'm using nTune from Nvidia to check the clock speeds; it displays them dynamically under Windows XP Pro. I'm getting a 375MHz core clock and a 502MHz memory clock (x2), no matter whether I'm running 2D or 3D apps.

I know the previous model of MBP would upclock the core of the Mobility Radeon X1600 from 311MHz to 472MHz when I ran Doom 3 or Quake 4. I was hoping the GeForce 8600M GT does the same thing, at least under OS X. But there is no way to check it under OS X at this point, since apps like ATIccelerator and Graphiccelerator won't work with it.

Whether that's true or not, the 8600M is running Quake 4 60% faster than the X1600 at 1280x800 High.

The interesting part is, just because you are getting 375MHz as the core clock, that doesn't mean the Shader Engine is running at that speed. The SE is double-pumped, so you are looking at 750MHz, which isn't too shabby. ATI doesn't do that in their D3D10 parts, so the speed they advertise is the speed their SE runs at. Nvidia has been running the SE at higher clocks since the 7 series.
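Taking the flat 2x figure at face value (the exact G84 shader/core ratio may differ), the arithmetic is simply:

```python
# "Double pumped" arithmetic: the shader engine runs at a multiple of the
# reported core clock. The flat 2x ratio is this post's claim; the exact
# G84 shader/core ratio may differ.
core_clock_mhz = 375
shader_clock_mhz = core_clock_mhz * 2
print(shader_clock_mhz)  # 750
```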
 
You're quite right. The GPUs aren't underclocked, but run to Apple's specs. If you increase the speed to match some other manufacturer, you are exceeding the specs and therefore overclocking.

By underclocking, I and the rest of this forum mean that the clock speed is lower than that advised (published) by the manufacturer of the GPU (in this case Nvidia). So if this GPU is indeed clocked 100MHz below the advised clock speed, then it, of course, is underclocked.

That Apple does that to make sure the MBPs don't run too hot, and that this lower clock speed is "Apple's spec", does not mean it is not underclocked... :)
 