As for what the GT 130 reports itself as, I am still wondering whether we are just seeing lookup-table guesses. Has anyone run RivaTuner against it? It is usually very good at identifying what the hardware actually does versus what a database table thinks it does.

I ran Riva Tuner on it a few days ago, results are posted somewhere in the middle of this thread:

https://forums.macrumors.com/threads/662789/

Sadly, I don't think it was that helpful.
 
GT 130 not even close to Mobility Radeon 4850

Okay, the GT 130M has just 32 processor cores, which puts it on par with the previous 9600M and 8600M series.

GT 130M:
http://www.nvidia.com/object/product_geforce_gt_130m_us.html

9600M:
http://www.nvidia.com/object/product_geforce_9600m_gt_us.html

This compares with 112 processors for the top-of-the-line mobile Nvidia 9800M GTX.

Remember, ATI and Nvidia shader processors are not equivalent: ATI needs the 800 processors in the 4870 to roughly match the 240 processors in the desktop GTX 280.
That said, 800 ATI processors are still far more than a match for 32 Nvidia processors.

For GPU-accelerated applications outside of games, the 4850 may not make a big difference today. But when Snow Leopard ships with OpenCL support, it most certainly will.

http://www.khronos.org/opencl/
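For background on why those stream-processor counts will matter under OpenCL: OpenCL is data-parallel, so one small kernel function runs once per work-item and the GPU spreads those instances across its stream processors. A minimal Python sketch of the idea (illustrative only — this is not the real OpenCL C API, and `kernel`/`enqueue` are hypothetical names):

```python
# Illustrative sketch of OpenCL's data-parallel model: the same kernel
# body runs once per work-item; a GPU runs many instances in parallel.
def kernel(global_id, a, b, out):
    # One work-item: process a single pair of elements.
    out[global_id] = a[global_id] + b[global_id]

def enqueue(kernel, global_size, *buffers):
    # On a GPU these instances run concurrently across stream processors;
    # here we just loop sequentially to show the semantics.
    for gid in range(global_size):
        kernel(gid, *buffers)

a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]
out = [0.0] * 3
enqueue(kernel, 3, a, b, out)
# out is now [5.0, 7.0, 9.0]
```

The point is that a chip with more (comparable) stream processors can run more of these work-items at once, which is why the shader counts above translate fairly directly into GPGPU throughput.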
 
I'm sorry but you are wrong. It's the 9800M GTS, not GS. The GTS is more powerful. (Recheck the thread yourself)

No, I'm sure that it is NOT the 9800M GTS, because the clock rates on the 9800M GS match up almost exactly with the info provided in the thread by RivaTuner and the nVidia drivers.

From that thread (emphasis added):
Finally got Vista Home Premium patched, tweaked and flying (boot camp). It lists the card as a 9800 variant.


Processor: Intel(R) Core(TM)2 Duo CPU E8435 @ 3.06GHz (3051 MHz)
Operating System: Windows Vista (TM) Home Premium, 32-bit (Service Pack 1)
DirectX version: 10.0
GPU processor: GeForce 9800 X
Driver version: 178.46
Stream processors: 64
Core clock: 529 MHz
Shader clock: 1323 MHz
Memory clock: 792 MHz (1584 MHz data rate)
Memory interface: 256-bit
Total available graphics memory: 1779 MB
Dedicated video memory: 512 MB
System video memory: 0 MB
Shared system memory: 1267 MB
Video BIOS version: 62.94.74.00.05
IRQ: 23
Bus: PCI Express x16 Gen2

AND also (add the two domains for the total, it adds up to within 30MHz):

$ffffffffff ----------------------------------------------------------------
$ffffffffff NVIDIA specific display adapter information
$ffffffffff ----------------------------------------------------------------
$0100000000 Graphics core : G94 revision A1 (64sp)
$0100000001 Hardwired ID : 062e (ROM strapped to 062e)
$0100000002 Memory bus : 256-bit
$0100000003 Memory type : DDR3 (RAM configuration 00)
$0100000004 Memory amount : 524288KB
$0100000100 Core clock domain 0 : 168.750MHz
$0100000101 Core clock domain 1 : 337.500MHz

$0100000006 Memory clock : 100.000MHz (200.000MHz effective)
$0100000007 Reference clock : 25.000MHz/27.000MHz
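As a sanity check on that "add the two domains" claim, here is the back-of-the-envelope arithmetic using the numbers quoted above:

```python
# Sum the two RivaTuner core clock domains and compare with the
# 529 MHz core clock reported by the nVidia driver panel above.
domain0 = 168.750   # MHz, core clock domain 0
domain1 = 337.500   # MHz, core clock domain 1
reported = 529.0    # MHz, driver-reported core clock
total = domain0 + domain1    # 506.25 MHz
delta = reported - total     # 22.75 MHz, i.e. within the quoted 30 MHz
```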

Now look at this chart:

[Image: Picture+35.png]
 
How is it possible that it's still not entirely certain what kind of card the GT 130 in the iMac is?

From what I've read on this forum, it seems a rather close match to the Mobility HD 4850. If the GT 130 can get 9.5k points in 3DMark, and the Mobility Radeon gets almost exactly the same, it wouldn't make sense to offer it as an alternative (or charge an extra 40 quid for it!). So I'd venture a guess that it's going to be the desktop version.
 
How is it possible that it's still not entirely certain what kind of card the GT 130 in the iMac is?

From what I've read on this forum, it seems a rather close match to the Mobility HD 4850. If the GT 130 can get 9.5k points in 3DMark, and the Mobility Radeon gets almost exactly the same, it wouldn't make sense to offer it as an alternative (or charge an extra 40 quid for it!). So I'd venture a guess that it's going to be the desktop version.

Actually, the Mobility Radeon 4850 gets something like 15K in that same benchmark, so it will probably be around 1.5x faster on average in many 3D applications, including games.
 
Okay, the GT 130M has just 32 processor cores, which puts it on par with the previous 9600M and 8600M series.

GT 130M:
http://www.nvidia.com/object/product_geforce_gt_130m_us.html

9600M:
http://www.nvidia.com/object/product_geforce_9600m_gt_us.html

This compares with 112 processors for the top-of-the-line mobile Nvidia 9800M GTX.

Remember, ATI and Nvidia shader processors are not equivalent: ATI needs the 800 processors in the 4870 to roughly match the 240 processors in the desktop GTX 280.
That said, 800 ATI processors are still far more than a match for 32 Nvidia processors.

For GPU-accelerated applications outside of games, the 4850 may not make a big difference today. But when Snow Leopard ships with OpenCL support, it most certainly will.

http://www.khronos.org/opencl/

It's the GT 130, NOT the GT 130M.

Actually, the Mobility Radeon 4850 gets something like 15K in that same benchmark, so it will probably be around 1.5x faster on average in many 3D applications, including games.


(Compare with 9800M GTS)
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
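The "1.5x faster" figure follows directly from the two 3DMark scores quoted in this thread (rough numbers, so treat the ratio as approximate):

```python
# Approximate 3DMark scores quoted earlier in the thread.
radeon_4850_mobility = 15000   # "something like 15K"
gt_130 = 9500                  # "9.5k points"
speedup = radeon_4850_mobility / gt_130   # ~1.58x
```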

No, I'm sure that it is NOT the 9800M GTS, because the clock rates on the 9800M GS match up almost exactly with the info provided in the thread by RivaTuner and the nVidia drivers.

From that thread (emphasis added):


AND also (add the two domains for the total, it adds up to within 30MHz):



Now look at this chart:

[Image: Picture+35.png]

It's the 9800M GTS:

https://forums.macrumors.com/posts/7229874/
 
Erm, not really...


No, it isn't. The two specs you compared in the post you linked to are both the specs for the 9800M GTS. That is why they are exactly the same: they are taken from the exact same source, for the exact same video card. Take a look at the posts you quoted a little more carefully. The post you quoted as giving the specs for the GT 130 (the one from itommyboy) actually contained notebookcheck's specs for the 9800M GTS.

Here is the post you linked to (my comments are bolded):
It really looks like the 9800M GTS

Look at the specs of the GT 130 here: (No, those are notebookcheck's specs for the 9800M GTS! If you want proof, look at the link itommyboy put at the end of his post, pointing to nVidia's page advertising the 9800M GTS. itommyboy posted those specs for comparison because Eidorian was insinuating that it was the 9800M GTS, based on what GPU-Z returned.)

https://forums.macrumors.com/posts/7227477/



and the ones of the 9800M GTS here:

http://www.notebookcheck.net/NVIDIA-GeForce-9800M-GTS.9918.0.html



They're exactly the same! (That is because they are both specs for the 9800M GTS! Neither of them is the specs for the GT 130.)
 
No, it isn't. The two specs you compared in the post you linked to are both the specs for the 9800M GTS. That is why they are exactly the same: they are taken from the exact same source, for the exact same video card. Take a look at the posts you quoted a little more carefully. The post you quoted as giving the specs for the GT 130 (the one from itommyboy) actually contained notebookcheck's specs for the 9800M GTS.

Here is the post you linked to (my comments are bolded):

https://forums.macrumors.com/showthread.php?p=7227477#post7227477

Scroll up a couple of posts... you need to view the entire thread to understand where he's getting the info from.
 
No, it isn't. The two specs you compared in the post you linked to are both the specs for the 9800M GTS. That is why they are exactly the same: they are taken from the exact same source, for the exact same video card. Take a look at the posts you quoted a little more carefully. The post you quoted as giving the specs for the GT 130 (the one from itommyboy) actually contained notebookcheck's specs for the 9800M GTS.

Here is the post you linked to (my comments are bolded):

My bad, I quoted the wrong post, but the GPU-Z readout of the GT 130 gives the same results (as you said) as the 9800M GTS:

https://forums.macrumors.com/posts/7227410/
 
My bad, I quoted the wrong post, but the GPU-Z readout of the GT 130 gives the same results (as you said) as the 9800M GTS:

https://forums.macrumors.com/posts/7227410/

That is true, but the nVidia drivers and RivaTuner indicate that it is the 9800M GS. I think the reason GPU-Z gives a clock of 600 MHz is that it is database-based. GPU-Z also claims there are 0 pixel shaders and 0 vertex shaders!

EDIT: Fixed an error.
 
That is true, but the nVidia Drivers and Riva Tuner indicate that it is the 9400 GS. I think the reason that GPU-Z gives a clock of 600 is because it is database-based.

I think it is detecting the 9400 in the base chipset.
 