That sucks. Mine hasn't run past 50C during general unplugged use. It's actually quite cool to the touch, hovering around 40C most of the time, even dipping below.
In OS X, about the same as you.
In Vista, it's really hot.
32 bit
I ran around Oblivion for a bit... not the highest settings and no AA, but close to max settings with HDR.
Forest - noticeably choppy while on a horse and while walking, but playable
Outdoors - a little choppy while on a horse
City - no lag
With previous MacBook Pros, the X1600 was underclocked from its intended clock speed. Has anybody with the new MBPs had a chance to check the clock speed of the 8600M GT? Also, is there a difference in clock speed between the 128MB and 256MB versions?
Weeeeee! The low-end model, you say? With high settings and HDR? At native resolution? You, kind sir (or madam), just made my day!
Would it be nice and smooth if you turned down the settings a bit? Is HDR or 2xAA more demanding?
I am sure the graphics card in any Apple computer works exactly at its intended clock speed. And usually the clock speed is intended to be low to save power when there isn't much work to do, and higher when the card is busy.
From experience, HDR + AA is a killer combination. Most of the highest-end cards (8800GTX/HD2900) scoff at the pair, though. AA hurts more if you don't have enough memory bandwidth for it; HDR is really computationally expensive, more so than AA. The D3D10 cards are designed to handle it much better than the D3D9 cards are.

Yeah... if that's on the model with only 128MB of VRAM, that's definitely nice.
Would love to hear if you get a chance to play around with the settings and find what gives you smooth performance... I imagine it's possible if settings and resolution are lowered a bit. And doesn't HDR also knock performance down quite a bit, even on the highest-end cards?
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
This is actually a really, really good score.
Another thing: under Vista, at least, this GPU shares video RAM and system RAM. The GPU settings say I have 887MB of total graphics memory, with 128MB of it dedicated. I assume it manages this number automatically.
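A quick sketch of how those numbers fit together (just Python arithmetic, using only the figures reported above; the split itself is managed by the driver):

```python
# Figures reported in the Vista GPU settings (from the post above)
dedicated_mb = 128          # VRAM dedicated to the 8600M GT
total_graphics_mb = 887     # "total graphics memory" reported by Vista

# TurboCache borrows the remainder from system RAM on demand
shared_mb = total_graphics_mb - dedicated_mb
print(f"Shared (system) graphics memory: {shared_mb} MB")  # prints 759 MB
```

So roughly 759MB of that total is just system RAM the card can borrow, not real VRAM.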
The score is actually a little lower than that of a recently benchmarked laptop with the 8600M GT, but it's still a pretty nice score nonetheless. The difference could be due to the use of a larger TurboCache in the Asus. Here is the link: http://www.notebookreview.com/default.asp?newsID=3728&review=Asus+G1S
He also might have 256MB of graphics memory instead of 128MB.
Does XP have that ability? And if it doesn't, will that significantly affect performance when running XP?
According to nTune, the core clock speed is 375 MHz and the memory is 502 MHz (1004 MHz effective).
So then will the 8600M GT, which AFAIK is a DX10 card, get that design improvement that allows it to handle HDR more easily? Will it still be in effect if you're playing a DX9 game like Oblivion?
Yes.
Yes, XP can do TurboCache.
Official numbers are 475 MHz core and 700 MHz (1400 MHz effective) memory. I'm somewhat disappointed, but it's nothing overclocking can't fix!
Link

Newer game engines like the one in Oblivion need large amounts of frame-buffer memory; when there isn't enough, performance can suffer. For cards that have 320MB of memory or less, the game crashes and returns to the Windows desktop. The game uses many graphics techniques that demand hardware that can handle them. At the highest resolution, neither the 8800GTS 320 nor the 8600-based cards can render the game. However, if you could afford the 30" LCD monitor to play at that resolution, you would most likely have the money to buy a pair of high-end cards to drive it in SLI.
I just installed the latest Asus drivers (straight from their website, for their 8600M GT notebook model) using a modded nv_disp.inf file, and according to nTune the core clock speed is 375 MHz and the memory is 502 MHz (1004 MHz effective).
That's 100 MHz below NVIDIA's spec of a 475 MHz core clock.
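A quick sanity check on those numbers (a throwaway Python sketch, using only the clocks quoted in this thread):

```python
# Clocks reported by nTune vs. NVIDIA's reference spec for the 8600M GT
reported_core_mhz, spec_core_mhz = 375, 475
reported_mem_mhz, spec_mem_mhz = 502, 700

core_deficit_mhz = spec_core_mhz - reported_core_mhz
core_pct_below = 100 * core_deficit_mhz / spec_core_mhz
mem_pct_below = 100 * (spec_mem_mhz - reported_mem_mhz) / spec_mem_mhz

print(f"Core: {core_deficit_mhz} MHz ({core_pct_below:.0f}%) below spec")
print(f"Memory: {mem_pct_below:.0f}% below spec")
```

That works out to the core running about 21% below spec and the memory about 28% below, so the memory clock is actually cut back even harder than the core.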
Their drivers could be setting that clock. Until we can get good (read: official) drivers, it may be hard to pin down what the true clock rate is.