
IceMacMac

Original poster
My question:
I have a 4870 and a GT120 in my 2009 Mac Pro. Unfortunately my 3D apps default to using the GT120 for OpenGL, not my more powerful 4870. I've tried swapping the cards into different slots, but that made no difference. I don't see any system preference where I can select one card over the other for OpenGL.

How can I change this, so that my 3D apps will prefer the 4870 and draw upon that card for OpenGL rendering?

Thanks in advance....
 
I'm thinking it depends on the type of 3D program you're using. Perhaps that program uses CUDA, which only works on Nvidia cards like the GT120. That might be why it defaults to that card.
 
I'm thinking it depends on the type of 3D program you're using. Perhaps that program uses CUDA, which only works on Nvidia cards like the GT120. That might be why it defaults to that card.

Interesting hypothesis. I don't think that Vue or C4D use CUDA... but maybe I'm wrong about that.
 
Vue's website doesn't mention CUDA. I'm also getting indications that even the most recent versions of Vue don't support it.
 
It's whatever card is connected to the monitor you are drawing to.

Some apps have a known bug where they don't do this. But if the card doing the rendering is different from the card driving the display, you usually take a performance hit.
 
It's whatever card is connected to the monitor you are drawing to.

Some apps have a known bug where they don't do this. But if the card doing the rendering is different from the card driving the display, you usually take a performance hit.

goMac... My previous understanding was identical to what you are suggesting here. That's what I believed.

But that's NOT how it's working on my system. I've tried every combination of cards driving different monitors. The software ALWAYS defaults to the OpenGL on my lower-end card.
 
Vue's website doesn't mention CUDA. I'm also getting indications that even the most recent versions of Vue don't support it.

CUDA is not engaged with any of my 3d apps. Not yet anyway.

I have to wonder, however, whether CUDA is in After Effects' future.

Several 3d renderers now utilize CUDA, as do some AE plugins.
 
goMac... My previous understanding was identical to what you are suggesting here. That's what I believed.

But that's NOT how it's working on my system. I've tried every combination of cards driving different monitors. The software ALWAYS defaults to the OpenGL on my lower-end card.

Some apps seem to have this bug. A developer is supposed to make a system call for a window that returns the graphics card for that window. I'm not sure where the bug is coming from, but if I were a betting man, I'd guess that they're getting a list of the cards (in alphabetical order) and picking the first one on the list, which is bad behavior.

File bugs with the application vendors. I know for sure this bug exists in AE.
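For reference, here is roughly what that per-window lookup looks like on 10.6. This is a minimal sketch of my own in C using the public CGL calls (the function name is just illustrative); it assumes the app has already resolved the CGDirectDisplayID for the screen its window sits on (for example from NSScreen's deviceDescription), and it restricts the pixel format to the renderer that drives that display.

// Sketch: pick the GPU that drives a given display by restricting the
// pixel format to that display's OpenGL display mask.
#include <OpenGL/OpenGL.h>
#include <ApplicationServices/ApplicationServices.h>

CGLContextObj createContextForDisplay(CGDirectDisplayID display)
{
    // Translate the display ID into an OpenGL display mask, then ask for
    // an accelerated pixel format limited to renderers that can drive it.
    CGOpenGLDisplayMask mask = CGDisplayIDToOpenGLDisplayMask(display);

    CGLPixelFormatAttribute attribs[] = {
        kCGLPFADisplayMask, (CGLPixelFormatAttribute)mask,
        kCGLPFAAccelerated,
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || pix == NULL)
        return NULL;                 // no renderer can drive that display

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;                      // context bound to the display's GPU
}

An app that skips this per-display step and just grabs the first renderer it finds will land on whichever card happens to come first, which matches the behavior described above.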
 

A huge pain, but you could remove the GT120 before you render.
 
Minifridge, final renders aren't impacted at all by this problem. Final renders in most apps don't utilize OpenGL in any way.

OpenGL is employed as you are constructing and navigating 3D scenes. It allows you to quickly spin, scale, and light your work. It lets you see an approximation of your textures, lights, shadows, etc.

I purchased the GT120 solely to drive a third monitor... and to let me manage the massive number of windows/viewports/palettes in C4D and AE. I certainly didn't expect to seriously downgrade my OpenGL performance.

This bug (so far) has made this GT120 purchase a complete waste of money.
 
Some apps seem to have this bug. A developer is supposed to make a system call for a window that returns the graphics card for that window. I'm not sure where the bug is coming from, but if I were a betting man, I'd guess that they're getting a list of the cards (in alphabetical order) and picking the first one on the list, which is bad behavior.

File bugs with the application vendors. I know for sure this bug exists in AE.

goMac... maybe I should badger the vendors. When I asked a Maxon tech about this, he mentioned I should be looking to Apple. And I think he might be right. If ALL my 3D apps are doing the same thing, it sounds like a bug in Apple's OS and/or the video card drivers.

But then I'm on a dance floor trying to get one of three parties to fess up and do something: Is it Apple? Nvidia? ATI?

Again...more and more I think it's inadvisable to mix and match ATI/Nvidia products in one machine.
 
goMac... maybe I should badger the vendors. When I asked a Maxon tech about this, he mentioned I should be looking to Apple. And I think he might be right. If ALL my 3D apps are doing the same thing, it sounds like a bug in Apple's OS and/or the video card drivers.

I write a lot of apps that use the GPU, and I specifically test my screen output on multi-card machines (my Mac Pro is a 5870/GT120 dual-card machine). I've never had an issue, and this is covered extensively in Apple's developer docs. I highly doubt this is a problem on Apple's end. :)

But then I'm on a dance floor trying to get one of three parties to fess up and do something: Is it Apple? Nvidia? ATI?

It's the app vendor.

Again...more and more I think it's inadvisable to mix and match ATI/Nvidia products in one machine.

Has nothing to do with the mix. Mixed cards work just fine on my machine.

In addition, a lot of stuff on my machine (including games) will switch cards just fine with this combo when I drag the window between displays.

Apple even makes it easy for an app to manually choose a card, or even use multiple cards at once in an app. Given that Apple has given an app the power to choose a card entirely on its own, and your apps are choosing the card wrong, it sounds like a problem on the apps' end. I don't think multi-card setups are tested very often, so this is a common bug I see with Mac apps. However, I see a lot of Mac apps that do it right, including my own code, so I really doubt this is a problem on Apple's end.

(I've also worked at places that have had this issue and simply haven't given a crap.)

Also, remember, Apple currently ships machines with GPUs from more than one vendor that are both active at the same time. Running multi-vendor setups is OK.
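To give a sense of how exposed this is, here is a rough sketch of my own (not Apple's sample code; the function name is illustrative) that enumerates every OpenGL renderer in the machine through the public CGL API. This is the usual starting point for letting an app, or a user preference, pick a card manually.

// Sketch: list every OpenGL renderer (GPU) the system knows about,
// including offline ones, with its ID, acceleration flag, and VRAM.
#include <OpenGL/OpenGL.h>
#include <stdio.h>

void listRenderers(void)
{
    CGLRendererInfoObj info = NULL;
    GLint count = 0;

    // A display mask of 0xFFFFFFFF asks about all displays at once.
    if (CGLQueryRendererInfo(0xFFFFFFFF, &info, &count) != kCGLNoError)
        return;

    for (GLint i = 0; i < count; i++) {
        GLint rendererID = 0, accelerated = 0, online = 0, vramBytes = 0;
        CGLDescribeRenderer(info, i, kCGLRPRendererID,  &rendererID);
        CGLDescribeRenderer(info, i, kCGLRPAccelerated, &accelerated);
        CGLDescribeRenderer(info, i, kCGLRPOnline,      &online);
        CGLDescribeRenderer(info, i, kCGLRPVideoMemory, &vramBytes);

        printf("renderer 0x%08x  accelerated=%d  online=%d  vram=%d\n",
               rendererID, accelerated, online, vramBytes);
    }
    CGLDestroyRendererInfo(info);
}

Passing a chosen ID back in via kCGLPFARendererID when building the pixel format pins a context to that specific card, so an app-level "which GPU do I use" preference is entirely doable.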
 
Apple even makes it easy for an app to manually choose a card, or even use multiple cards at once in an app. Given that Apple has given an app the power to choose a card entirely on its own, and your apps are choosing the card wrong, it sounds like a problem on the apps' end. I don't think multi-card setups are tested very often, so this is a common bug I see with Mac apps.

goMac is absolutely correct. I also use Cinema 4D. Which video card powers which viewport should be an option in that viewport's "Configure" pane (Shift+V). It is the application's responsibility to detect and configure its own OpenGL usage.
 
I have that exact same problem, friend.

This thread gives you a workaround

https://forums.macrumors.com/threads/901521/

Apparently you need to unplug the screen connected to your GT 120, then launch the program in question; it should automatically default to the 5870. Then reconnect the secondary screen.

I'm sure it is an Apple problem, because I have no trouble with default GPU usage on the Windows side of my Mac Pro – Rhino, 3ds Max and Photoshop default to the 5870 with no problems.

Even iStat Menus on the Mac seemed to default to the GT 120 and ignore my 5870; only when I unplugged the screen connected to the GT 120 did iStat Menus show GPU usage on the 5870.

Ergo it is an Apple thing.

Initially, Photoshop on the Windows side had that issue, but updating the 5870 drivers solved it.

It's a real irritation when, as limited as the GPU options are on the Mac, any (legal) attempt at making the best of a bad situation makes the situation worse.

Here is a YouTube video explaining a method of getting a 5870 to work with a 5770 in a Mac Pro; it may be of some help to you. I will be doing this at some point – though I'm still feeling the pain of spending £200 on a useless card.

http://www.youtube.com/watch?v=ui6q_6k1vxA
 

While we are on the subject of GPU options, why does Apple only offer ATI (AMD) desktop GPUs rather than ATI's (AMD's) workstation-class GPUs? Surely they should offer a workstation card for a workstation machine? And I mean built in, not a 3rd-party option like the Quadro. Why use a gaming GPU in a system which isn't really used for, or even good at, gaming?
 
Same problem here with Photoshop CS5.

PS automatically uses the GT120 instead of the 4870, even though PS runs on the screens connected to the 4870.
Unfortunately, the only solution to this problem is unplugging the GT120's screens. Not nice!

However, CS4 doesn't have this issue and utilises the 4870. Apparently Adobe screwed up (again ;)).
 
I'm sure it is an Apple problem, because I have no trouble with default GPU usage on the Windows side of my Mac Pro – Rhino, 3ds Max and Photoshop default to the 5870 with no problems.

Even iStat Menus on the Mac seemed to default to the GT 120 and ignore my 5870; only when I unplugged the screen connected to the GT 120 did iStat Menus show GPU usage on the 5870.

Ergo it is an Apple thing.

Initially, Photoshop on the Windows side had that issue, but updating the 5870 drivers solved it.

There are a few possible conclusions to these findings:

1. Windows developers (for the same applications) choose to code video card detection into their apps, and select the appropriate one.
2. Windows does the video card detection and selection for the applications.

I would be very skeptical of number one. So that leaves number two. Which then raises the question: is it actually correct for the OS to take charge of video card detection and selection for applications?

I would argue that it is not. The OS is not aware of any application-specific parameters. What if I have a 3D program with two viewports and I want them each driven by a different video card (to split the load)? In the Windows model, is this even possible? Or does the OS simply decide that the whole application uses the faster card? In the Apple model, this is possible in the program's preferences, assuming the developer takes the time to set it up. I think Apple's default (use the video card driving the display) makes sense.
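For what it's worth, that per-viewport case is straightforward with the same CGL calls mentioned earlier. A rough sketch under my own assumptions (the function name and the idea of feeding it renderer IDs from a preferences pane are illustrative, not any particular app's code):

// Sketch: pin a context to one specific renderer ID, so two viewports
// can each be given their own card. The IDs would come from
// CGLQueryRendererInfo (or a user-facing preference built on top of it).
#include <OpenGL/OpenGL.h>

CGLContextObj contextForRenderer(GLint rendererID)
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFARendererID, (CGLPixelFormatAttribute)rendererID,
        kCGLPFAAccelerated,
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || pix == NULL)
        return NULL;

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;
}

// Hypothetical usage: one viewport on the Radeon, one on the GeForce.
//   CGLContextObj leftView  = contextForRenderer(radeonID);
//   CGLContextObj rightView = contextForRenderer(geforceID);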

Now this bug, as goMac says, doesn't seem like an Apple bug to me.
 
I'm sure it is an Apple problem, because I have no trouble with default GPU usage on the Windows side of my Mac Pro – Rhino, 3ds Max and Photoshop default to the 5870 with no problems.

Even iStat Menus on the Mac seemed to default to the GT 120 and ignore my 5870; only when I unplugged the screen connected to the GT 120 did iStat Menus show GPU usage on the 5870.

Ergo it is an Apple thing.

This is not an Apple thing. Apple gives developers the power to pick any GPU they want. I haven't had any issues with this in my own code, and as other posters have noted, other programs work just fine.

If other programs on OS X are working just fine with dual GPUs, this is an Adobe problem. This is even more likely given that it works in CS4 and breaks in CS5.
 
This is not an Apple thing. Apple gives developers the power to pick any GPU they want. I haven't had any issues with this in my own code, and as other posters have noted, other programs work just fine.

If other programs on OS X are working just fine with dual GPUs, this is an Adobe problem. This is even more likely given that it works in CS4 and breaks in CS5.


goMac... As I have mentioned, this bug impacts C4D, Vue, and AE on my machine. And one of the guys here has mentioned iStat Menus showing the same problem. So it is not just an Adobe problem. Please drop that notion.

I have raged in many forums and threads about the horrendously buggy software coming out of Adobe-let's-offshore-our-code-Incorporated these days. But it's not fair to single them out in this case.
 
goMac... As I have mentioned, this bug impacts C4D, Vue, and AE on my machine. And one of the guys here has mentioned iStat Menus showing the same problem. So it is not just an Adobe problem. Please drop that notion.

I'm a graphics developer, and I'm telling you my code has no issues picking a GPU under OS X, and I have several other programs that are switching between GPUs just fine under 10.6.6. Because on a developer level I can verify that Apple's APIs are working (and because CS4 performs properly), this only leaves the applications themselves as the culprit.

I have raged in many forums and threads about the horrendously buggy software coming out of Adobe-let's-offshore-our-code-Incorporated these days. But it's not fair to single them out in this case.

Definitely looks like Adobe here (and the other application vendors).
 
Definitely looks like Adobe here (and the other application vendors).

Options:

1. Blame Apple.

2. Blame Maxon (and their superior engineers), E-On Software, Adobe... and who knows how many other app vendors...

3. Blame the display card makers for bad drivers.

I'm not knowledgeable enough to know which of these three it is. But I'll say this: unless somebody has an agenda here, you can't single out Adobe. No way. And again... I've been a big-time critic of Adobe's engineers in recent years.
 
2. Blame Maxon (and their superior engineers), E-On Software, Adobe... and who knows how many other app vendors...

...

I'm not knowledgeable enough to know which of these three it is. But I'll say this: unless somebody has an agenda here, you can't single out Adobe. No way. And again... I've been a big-time critic of Adobe's engineers in recent years.

Again, as a developer, I have no problem picking which card I want to use. If I can do it just fine, then so can Adobe. Adobe's failure is no one's fault but their own. They can write the exact same code I can. In addition, I know of several other programs doing this properly.

Given those facts, why do you continue blaming Apple? Are you saying I'm a better engineer than those at Adobe or Maxon? It's only a few lines of code.

I'm not sure what the disconnect is. Another user is reporting that they can manually switch cards under Cinema 4D (showing it works). It works fine under CS4 (showing it works). I've written code that switches cards, and even uses both at the same time (showing it works). There are numerous Apple developer documents on this (showing it works). Every piece of evidence points to this being an application bug. Every single piece of evidence also points to this working fine under OS X, given that it's been repeatedly demonstrated that this behavior works just fine. Whatever holds true for my app also holds true for Adobe. Not to mention, EVERY single MacBook Pro is currently dual-GPU. Would Apple break this for every single MacBook Pro?
 
goMac

As the OP here, let me again stress: Adobe causes me ZERO heartburn in this matter. Like most AE veterans, we don't even use OpenGL in AE... we turn it off in preferences and are just fine without it. AE is not a real 3D program.

The central issue is with my 3D apps. And yes, I think Maxon probably has much better engineers than you and could figure such a thing out quite easily if it were as simple as you are proposing.

What we do know for absolute certainty:
As is widely known, Apple bites when it comes to OpenGL support. Year after year after year... they SUCK at it. They are years behind Windows when it comes to OpenGL. Since this whole thread is about an OpenGL issue... well, you can't help but wonder if Apple bears some culpability.

I can't say for sure... maybe Maxon, E-On Software and others are just lame and can't figure out something that you can do in 20 minutes... but it sure sounds like an Apple issue to me.
 
Another user is reporting that they can manually switch cards under Cinema 4D (showing it works).

You are completely misreading somebody's post. No one in this thread said as much.

This thread wouldn't even exist if C4D could do that. I started this thread primarily because of C4D and this OpenGL problem.

You are the one who has errantly brought Adobe into this discussion. Adobe has almost NOTHING to do with this discussion as far as I'm concerned. Yet you seem hell-bent on injecting your Adobe frustrations into it.
 
The central issue is with my 3D apps. And yes, I think Maxon probably has much better engineers than you and could figure such a thing out quite easily if it were as simple as you are proposing.

Are you sure about this? Someone in this thread posted that they can switch cards just fine in Cinema 4D to an ATI card.

Again, the fix is simple. I have several apps that work just dandy. If Maxon can't fix this, I'd be glad to fix it for them.

What we do know for absolute certainty:
As is widely known, Apple bites when it comes to OpenGL support. Year after year after year... they SUCK at it. They are years behind Windows when it comes to OpenGL. Since this whole thread is about an OpenGL issue... well, you can't help but wonder if Apple bears some culpability.

This actually has nothing to do with the drivers... it's one level up...

I can't say for sure... maybe Maxon, E-On Software and others are just lame and can't figure out something that you can do in 20 minutes... but it sure sounds like an Apple issue to me.

It's not. I don't have this issue. CS4 didn't have this issue.

If, on the same system, CS4 doesn't have the issue but CS5 does, what does that tell you?

Edit:

For developer reference, Apple has an entire session in the WWDC 2010 library on handling multiple GPUs from multiple vendors. It is session 422. They do live demos of switching between, and using both, an Nvidia card and an ATI card in the same Mac Pro under Mac OS X 10.6. They also walk through the code you need in order to switch between GPUs, or to request a specific GPU. If you need to, I'm sure you can refer the engineering teams of the apps with this issue to this session.
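To give a flavor of what that covers, here is a rough sketch of my own (not a transcript of the session's demo code; names are illustrative) of the two pieces involved: a pixel format that can reach every GPU, including ones with no display attached, and the virtual-screen switch that moves a context between renderers.

// Sketch: a pixel format that spans all GPUs (offline ones included),
// and a manual hop between the virtual screens it exposes.
#include <OpenGL/OpenGL.h>

static GLint gVirtualScreens = 0;   // filled in by createMultiGPUContext

CGLContextObj createMultiGPUContext(void)
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAAccelerated,
        kCGLPFADoubleBuffer,
        kCGLPFAAllowOfflineRenderers,   // include GPUs driving no display
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pix = NULL;
    if (CGLChoosePixelFormat(attribs, &pix, &gVirtualScreens) != kCGLNoError || pix == NULL)
        return NULL;

    CGLContextObj ctx = NULL;
    CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;
}

// Each virtual screen corresponds to one renderer. Switching is one call;
// an app would do this when its window lands on a display driven by the
// other card, or when the user picks a card in its preferences.
void switchToNextRenderer(CGLContextObj ctx)
{
    GLint current = 0;
    CGLGetVirtualScreen(ctx, &current);
    CGLSetVirtualScreen(ctx, (current + 1) % gVirtualScreens);
}

Nothing in this requires driver changes; it all lives at the application level, which is the point being made here.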
 