Woot!

After almost 24 hours without sleep, I finally got my 9800GX2 fully functional (as my default display and everything).

System:
Apple Mac Pro (2x 2.8GHz quad-core) (NOT a Hackintosh)
6GB RAM
NVIDIA 8800GT (Apple upgrade kit)
NVIDIA 9800GX2 (retail eVGA card, bought from Newegg)

Tomorrow I will post more details and pics of the 9800GX2 working under Mac OS X 10.5.5 on my Mac Pro.

One last note: I F***ing hate Apple for this. If it took me only 24 hours without sleep to make this work, why the hell don't they make it available to everybody? I paid a LOT of money for this computer, plus another big chunk for the two 9800GX2s, PLUS some more for the 8800GT upgrade kit, just so I could use the 8800GT's EFI-enabled firmware to "validate" the 9800s for Mac OS X.

Next time I will just take all this money, build a customized Hackintosh, and save myself the headache...

Interesting, be sure to post those pictures and perhaps a few benchmarks :)
 
After some reading, I saw that few to no applications/games benefit from quad-SLI (with two 9800GX2s). So I decided to forget about running two 9800GX2s and stick with just one 9800GX2 and one 8800GT (so I could boot OS X).

SLI does not work on the Mac Pro anyway... not in Windows, not in OS X, not ever...
 
That's not true. As you can see in the deviceQuery output, Mac OS X recognizes both cores. They just won't be used in "SLI" mode, which I don't care about, since about 90% of the stuff I use gets no benefit from SLI.

In fact, CUDA applications work better if you disable SLI, even under Windows, so each application can manage both GPUs by itself.
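
To illustrate what managing both GPUs from the application looks like, here's a rough sketch of a CUDA app driving every GPU it finds, one after the other. This is my own illustration, not code from this thread, and it's written against the modern CUDA runtime where one host thread can switch devices with cudaSetDevice(); on the CUDA 2.x toolkit of that era you would spawn one host thread per device instead.

Code:
// Illustrative sketch: run the same trivial kernel on every CUDA device in the box.
#include <stdio.h>
#include <cuda_runtime.h>

__global__ void fill(float *out, float value, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = value;
}

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);          // 3 on this Mac Pro: 2x G92-450 plus the 8800 GT
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Using device %d: %s\n", dev, prop.name);

        cudaSetDevice(dev);              // each GPU gets its own share of the work, no SLI involved
        float *d_buf = NULL;
        const int n = 1 << 20;
        cudaMalloc((void **)&d_buf, n * sizeof(float));
        fill<<<(n + 255) / 256, 256>>>(d_buf, (float)dev, n);
        cudaDeviceSynchronize();
        cudaFree(d_buf);
    }
    return 0;
}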

Also, a single 9800GX2 is already faster than an 8800GT for gaming.

So, in the end, it's a win/win trade.
 
I have a few questions.

Is there any application to monitor GPU usage?

I have a feeling the 9800GX2 will make use of both cores, since even under Windows you see two separate cards, and you don't get the option to enable/disable SLI when you use a single 9800GX2 (two cores); when you run applications, the card uses both cores anyway. Also, it's been shown that you can run SLI on virtually any chipset if you make the driver skip the SLI chipset verification. I think it's worth a try.

Another question: what benchmark software do you recommend, so I can post some benchmarks for the setup I am using?
 
Dude, pics & howto plz, nice job btw :cool:

Dunno if this will help, but there are three programs that might: OpenGL Profiler, OpenGL Extensions Viewer, and OpenGL Driver Monitor. They should be part of the Xcode developer tools.
 
For those interested in trying this before I have time to post a full how-to:

The final setup was:

Slot-1 ----9800GX2----
Slot-2 ----8800GT-----

You also need to add the device ID strings for the 9800GX2 to the GeForce, NVDAResman, and NVDANV50Hal extensions.

You also need two monitors: one connected to DVI port 2 of the 8800GT and one to DVI port 1 of the 9800GX2.

As the final step to make it really work, I had to use a modified NVinject 0.2.2, not only so the OS would see the 9800GX2 as a valid card (the 8800GT EFI firmware makes the system see all other NVIDIA cards as valid), but also so the OS would actually load the drivers.

The kexts for OS X 10.5.5 are here:

http://www.unsekure.net/mac9800gx2/9800gx2kexts.tgz

While the system is booting, the gray Apple logo shows on the 8800GT display. Right before the system loads the login screen, the display auto-switches to the 9800GX2 display (NVinject kicks in and makes the OS recognize the 9800GX2).

In the end, because of NVinject, the system reports all NVIDIA cards as a 9800GX2, as you can see in the pic below. I still need to fix that.

9800gx2working.png


However, I think it's just a cosmetic thing, because in the end CUDA can see the right cards:

Code:
There are 3 devices supporting CUDA

Device 0: "G92-450"
  Major revision number:                         1
  Minor revision number:                         1
  Total amount of global memory:                 536674304 bytes
  Number of multiprocessors:                     16
  Number of cores:                               128
  Total amount of constant memory:               65536 bytes
  Total amount of shared memory per block:       16384 bytes
  Total number of registers available per block: 8192
  Warp size:                                     32
  Maximum number of threads per block:           512
  Maximum sizes of each dimension of a block:    512 x 512 x 64
  Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
  Maximum memory pitch:                          262144 bytes
  Texture alignment:                             256 bytes
  Clock rate:                                    1.51 GHz
  Concurrent copy and execution:                 Yes

Device 1: "G92-450"
  Major revision number:                         1
  Minor revision number:                         1
  Total amount of global memory:                 536674304 bytes
  Number of multiprocessors:                     16
  Number of cores:                               128
  Total amount of constant memory:               65536 bytes
  Total amount of shared memory per block:       16384 bytes
  Total number of registers available per block: 8192
  Warp size:                                     32
  Maximum number of threads per block:           512
  Maximum sizes of each dimension of a block:    512 x 512 x 64
  Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
  Maximum memory pitch:                          262144 bytes
  Texture alignment:                             256 bytes
  Clock rate:                                    1.51 GHz
  Concurrent copy and execution:                 Yes

Device 2: "GeForce 8800 GT"
  Major revision number:                         1
  Minor revision number:                         1
  Total amount of global memory:                 536674304 bytes
  Number of multiprocessors:                     14
  Number of cores:                               112
  Total amount of constant memory:               65536 bytes
  Total amount of shared memory per block:       16384 bytes
  Total number of registers available per block: 8192
  Warp size:                                     32
  Maximum number of threads per block:           512
  Maximum sizes of each dimension of a block:    512 x 512 x 64
  Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
  Maximum memory pitch:                          262144 bytes
  Texture alignment:                             256 bytes
  Clock rate:                                    0.81 GHz
  Concurrent copy and execution:                 Yes

Test PASSED

All cards are working perfectly, except that after the 9800GX2 takes over the display, the 8800GT can't show anything; it won't even detect its display again. But who cares :) the 9800GX2 is working.

I tested a couple of games (Spore and WoW). Both ran at 60fps (with vsync) and 160+ fps without vsync, at 1920x1200 with maximum FSAA and full details.

Of course your mileage may vary depending on your system. :)

Post any new info you get if you try this, and feel free to contact me with any questions.
 
I just tried this with a 9800GX2 I bought today on an 8x 3.2GHz Mac Pro and it works as advertised. Now the only thing left to do is to get Vista to play nice with the cards...

BTW, you can also get NVInject to leave the 8800 alone (that's why the 8800 can no longer power a display and takes on the name of the GX2 with the method above). Change the IOPCIMatch line in NVInject's Info.plist from

0x000010de&0x0000ffff
(matches any NVIDIA hardware)

to

0x060410de

(only matches the GX2).

This way, you'll retain full use of the 8800...

The next step is to convince the guy from here to give up his method to EFI-load the GX2 on its own via the EFI shell...

AG
 
The 9800GX2 is two 8800s. How can you be sure you aren't just using one of them?


You should run better benchmarks. WoW fps can vary from 1 to 600 depending on where you are :rolleyes: Try the ones Barefeats uses.

http://www.barefeats.com/harper8.html


If you take a look at the deviceQuery output, you will see that the 8800GT only has 112 cores and a 0.8GHz clock.

The 9800GX2 has 128 cores per G92 chip at 1.5GHz each, and I successfully uploaded kernel code via CUDA to execute on all 256 cores of the 9800GX2.
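
For anyone who wants to check those numbers, here's a quick illustrative sketch (not code from this thread) that derives the SM and core counts the same way deviceQuery does. On these compute-capability-1.1 G9x chips each multiprocessor has 8 scalar cores, so the two G92-450 devices report 16 SMs (128 cores each) and the 8800 GT reports 14 SMs (112 cores).

Code:
// Illustrative sketch: print SMs, cores, and clock for every CUDA device.
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, dev);
        int cores = p.multiProcessorCount * 8;   // 8 cores per SM on G80/G92 parts
        printf("Device %d: %-16s  %2d SMs, %3d cores, %.2f GHz\n",
               dev, p.name, p.multiProcessorCount, cores,
               p.clockRate / 1.0e6);             // clockRate is reported in kHz
    }
    return 0;
}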
 
Glad it worked for you.

I will try the NVinject fix above. Thank you very much.

EDIT: To make it work as you said, I had to change 0x000010de&0x0000ffff to 0x060410de&0xffffffff. If I switched to only 0x060410de it wouldn't work. So yeah, now both cards are working 100%. Thank you.
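
For anyone curious how those IOPCIMatch strings are interpreted, here's a small sketch of the matching as I understand it; this is just an illustration of the semantics, not code taken from NVinject or IOKit. The 32-bit value packs the PCI device ID in the upper 16 bits and the vendor ID (0x10de for NVIDIA) in the lower 16 bits, and the optional "&mask" selects which bits are compared.

Code:
// Illustrative sketch of IOPCIMatch-style "value&mask" matching.
#include <stdio.h>

static int pci_match(unsigned int id, unsigned int value, unsigned int mask)
{
    return (id & mask) == (value & mask);
}

int main(void)
{
    unsigned int gx2    = 0x060410de;   /* device 0x0604, vendor 0x10de: 9800 GX2 */
    unsigned int gt8800 = 0x061110de;   /* a common 8800 GT device ID, assumed here for illustration */

    /* Original NVinject entry: mask keeps only the vendor ID, so any NVIDIA card matches. */
    printf("%d %d\n", pci_match(gx2,    0x000010de, 0x0000ffff),
                      pci_match(gt8800, 0x000010de, 0x0000ffff));   /* prints: 1 1 */

    /* Entry from this thread: full 32-bit compare, so only the GX2 matches. */
    printf("%d %d\n", pci_match(gx2,    0x060410de, 0xffffffff),
                      pci_match(gt8800, 0x060410de, 0xffffffff));   /* prints: 1 0 */
    return 0;
}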
 
Another question to answer: does the 7300GT give you the same effect as having an 8800GT in there?
 
You shouldn't expect it to work... The known cases (a Mac and a PC HD2600XT; a Mac and a PC 8800; and this) all involve very similar cards (after all, the 9800GX2 is G92, just like the 8800).

On the other hand, the only benefit of this effect is that it bootstraps the card's fan controller (you still have to run NVinject to get the card's drivers loaded), so who knows...

AG
 
So based on what's been said above, I could have both an 8800GT for OS X and the 9800GX2 for Vista or OS X? Or is there still a problem getting the 8800GT and 9800GX2 to behave in Vista?

Thanks

Steve
 
For now I still couldn't get the 8800GT display to work in OS X. As soon as OS X recognizes the 9800GX2, the 8800GT goes blank. (You can still access the 8800GT in OS X for CUDA applications; it's just the display that goes blank.)

On Vista 64, both cards are working with no problems. I am using the 8800GT for my extended desktop, with the secondary display, and the 9800GX2 as the main display.

So basically, I am now using the 9800GX2 for both OS X and Vista. No problems so far; not a single crash.
 
Ok thanks, last couple of questions :)

Are you using external power, or is everything powered from the Mac PSU?

If I only ever had the 9800GX2 connected to the monitor, would it still display ok in OS X? (I guess minus the boot screen.)

Steve
 
I am using one external PSU to power the 8800GT.

The 9800GX2 is powered by the two six-pin power cables from the Mac motherboard; I am using one 6-pin to 8-pin adapter for that.

I didn't try using only one monitor yet, but my guess is that the system will just not show the boot screen. Everything else should work from there.
 
How can I do this?

I really love my 2.8GHz octo-core Mac Pro for OS X work and heavy XP gaming like Crysis, Crysis Warhead, COD4, FSX, and the upcoming Far Cry 2 and Call of Duty 5. I have an 8800 GT but would like more graphics power, like a GTX 280 or 9800 GX2. I would like to have only one card running in my system, and would like to know if NVinject does all the work for me or whether I have to start tweaking files to make the GX2 work. Please let me know.
 
So far, to make the 9800GX2 work in a Mac Pro, you still need an 8800GT inside your Mac.

First, because the Mac won't even boot without an EFI card installed. Second, because you need the 8800GT EFI firmware to load in order to "validate" the 9800GX2 for OS X.

I haven't had a chance to test the GTX 280; I don't have one. My guess is that it won't work, since it's a new chip, but I am the kind of guy who only believes something won't work after someone really tests it extensively. So, who knows. Before I tried the 9800GX2 I was told countless times that it wouldn't work on my real Mac Pro, for countless reasons.
 
And you don't even get the full power of the 9800GX2 under OS X, only under Windows.

Go send a big thank-you to Apple for that.

When someone decides to drop US$3.2k on a computer, they expect to be able to push it to the limit in every respect. This early 2008 Mac Pro is an awesome machine, but Apple managed to limit the full potential of this beast by not offering a decent video card for it. By decent I mean a 9800GX2 or even GTX 2x0 cards, not a half-assed 8800GT, which is already an old card these days. Most games already run slowly on it. Even WoW, probably the most-played game on the Mac, runs at a mediocre 25-ish fps on the 8800GT. Wtf... I can get better than that on a US$1000 PC just because of the video card.
 
+1. Apple should really put more effort into that. Basically, they need to work more closely with NVIDIA and AMD to come up with better driver support.
The problem, of course, is that our market is probably too small to warrant a standalone OS X driver unit within NVIDIA and AMD. From what I have gathered, there is still no GPU-accelerated HD video playback in OS X (not that it matters much on a Mac Pro), but still.
Also, Apple should make the PSU power more accessible to additional GPUs and whatnot. Having to buy two proprietary cables for 60 bucks just to draw power from the motherboard is absolutely pathetic.
 
You have some severe issues if you are getting anywhere around 25fps in WoW with an 8800GT. In OS X I got 70-80 fps, and that's at 1920x1200.

In Windows it was regularly over 100fps (which is a moot point on an LCD anyway).

This is with high settings across the board; even a single 3870 gets 60fps at the same settings.

On a dual quad-core Mac Pro with plenty of RAM, if you are dropping anywhere near 25fps in WoW then something is going on with your setup. A packed Ironforge doesn't drop under 50fps on mine.
 
Well... I am talking about fights like Felmyst, M'uru, and KJ, not 10-man Karazhan.

If you can get 100fps in those fights on your 8800GT, I would love to know what the magic is.

It looks like you have a severe lack of information.
 
I know this is not the best way to benchmark cards, but it's something :)

8800gt

8800bench.png


9800gx2

9800gx2bench.png


In WoW, in the same places where I used to get around 25fps on the 8800GT (graphics-intensive fights like 25-man end bosses), I now get around 45fps, which is much better.
 