Indeed.
Where we'll see improvement (assuming good drivers) will depend a lot on workload.
Based on the Bare Feats figures and what I know of the chip itself, at Apple's stock speeds the 8600M GT has about 70% more memory bandwidth than the Mobility X1600, and about 120% more fillrate. Shaders, OTOH, should be in the range of 3-5x faster, I think. Not sure, since the "unified" SM4.0 shaders are quite different from SM3.0...
Fillrate-bound stuff could see a doubling of framerate, it appears, and shader-bound stuff even more.
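If anyone wants to sanity-check those ratios, here's a rough back-of-envelope sketch in Python. The clocks plugged in are my placeholders, not confirmed Apple shipping speeds (both chips do use a 128-bit memory bus), so substitute the real numbers:

```python
# Back-of-envelope peak-spec calculator. The clocks plugged in below are
# placeholders, NOT confirmed Apple shipping speeds - substitute real ones.

def mem_bandwidth_gb_s(bus_width_bits, effective_mem_clock_mhz):
    """Peak memory bandwidth: bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * effective_mem_clock_mhz * 1e6 / 1e9

def pixel_fillrate_gpix_s(rops, core_clock_mhz):
    """Peak pixel fillrate: ROPs * core clock."""
    return rops * core_clock_mhz * 1e6 / 1e9

# Both chips have a 128-bit memory bus.
x1600_bw  = mem_bandwidth_gb_s(128, 940)    # hypothetical 470 MHz DDR memory
gf8600_bw = mem_bandwidth_gb_s(128, 1400)   # 700 MHz GDDR3 (DDR)
print(f"bandwidth advantage: {gf8600_bw / x1600_bw - 1:.0%}")

# RV530 (X1600) has 4 ROPs, G84 (8600M GT) has 8; clocks are placeholders.
x1600_fill  = pixel_fillrate_gpix_s(4, 470)
gf8600_fill = pixel_fillrate_gpix_s(8, 475)
print(f"fillrate advantage:  {gf8600_fill / x1600_fill - 1:.0%}")
```

The point is just that bandwidth scales with memory clock and bus width, while pixel fillrate scales with ROP count and core clock.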
I don't know how you calculated that, but from Wikipedia:
http://en.wikipedia.org/wiki/GeForce_7_Series
7600 GS: 400/800 MHz core/mem, 12/5 pixel/vertex shaders.
7600 GT: 560/1400 MHz core/mem, 12/5 pixel/vertex shaders.
From nvidia:
http://www.nvidia.com/object/go_7_series_techspecs.html
That page sucked, so from this forum thread instead:
http://www.laptopvideo2go.com/forum/index.php?showtopic=8896&pid=50215&st=20&#entry50215
7600 Go: 450/800 MHz is said to be stock for one Asus laptop; another has 450/700.
http://www.nvidia.com/object/geforce_8600M.html
8600M GT: 475/1400 MHz core/mem, 32 stream processors.
Anyway, I don't know how to calculate this stuff. Since stream processors can be used for everything they should be a little more powerful, but one stream processor works on one pixel at a time, while one of the old shader units calculated 4 components, which I think means 4 pixels? Or something like that? In that case the 32 stream processors would be equivalent to 8 regular shaders, and the 7600 series seems to have 17 shaders, and 17*450 > 8*475. But there is probably more to it =P; I know the 8600 series could do twice as many texture whatevers as the 8800 series could, whatever that means.
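To make the naive comparison explicit, the arithmetic I'm trying to do looks something like this in Python; it counts one vec4 shader unit as four scalar ops per clock, which is a big simplification, and it ignores that the G8x stream processors run in their own, faster shader clock domain:

```python
# Naive "shader ops per second" comparison: counts one vec4 shader unit as
# 4 scalar operations per clock. This ignores efficiency differences and the
# fact that G8x stream processors run in their own, faster shader clock
# domain, so treat it as a very rough exercise, not a real benchmark.

go7600_units = 12 + 5      # pixel + vertex shader units (vec4)
go7600_clock = 450e6       # Hz, one of the reported stock core clocks
gf8600_sps   = 32          # scalar stream processors
gf8600_clock = 475e6       # Hz, core clock (NOT the separate shader clock)

go7600_ops = go7600_units * 4 * go7600_clock   # 17 * 4 * 450 MHz
gf8600_ops = gf8600_sps * 1 * gf8600_clock     # 32 * 1 * 475 MHz

print(go7600_ops / 1e9, "naive GOps/s for the 7600 Go")
print(gf8600_ops / 1e9, "naive GOps/s for the 8600M GT")
```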
Is there someone who understands **** like that better? Anyway, I only care about FPS.
I would want to be able to play Supreme Commander (doubt it), Crysis (maybe?) and StarCraft 2, but I guess I'm screwed?
Performance in OS X on my 6800 LE is ****.