While we are on the subject of GPU options, why does Apple only offer ATI (AMD) desktop GPUs rather than ATI's (AMD's) workstation-class GPUs? Surely they should offer a workstation card for a workstation machine? And I mean "built in", not a 3rd-party option like the Quadro. Why use a gaming GPU on a system which is not really used for, or even good at, gaming?

I couldn't agree more. And seeing that 3D apps are increasingly turning to the GPU for mega-power rendering (OpenCL and CUDA), this is going to become a game-changer for me if Apple doesn't address it.

If, by the release of Lion and the next Mac Pros, Apple hasn't
1. Addressed its woeful OpenGL support
2. Shown it can work much better with GPU vendors to land the best cards

my next major system will be a Windows beast for all my 3D work.

Apple is the richest, most powerful tech company in the world. They can do whatever they want. Increasingly I get the impression that Apple doesn't care much about the Pro user. More and more they are becoming a consumer company.

Prominent Apple commentator John Gruber said that the future of the Mac is the Air and the iMac.
 
Are you sure about this? Someone in this thread posted that they can switch cards just fine in Cinema 4D to an ATI card.

Nobody posted any such thing.

Again, the fix is simple. I have several apps that work just dandy. If Maxon can't fix this, I'd be glad to fix it for them.

Please send them this thread and volunteer. You'd make me and a lot of others very happy.
 
None of these apps provide any mechanism to select or switch to a different monitor. That's the point of the thread. (Note the thread title: How can I Designate a Graphics Card for OpenGL rendering?)

Here are 3 examples. My ATI is selected at the moment only because my GT120 is currently disconnected.

AE-openGL.jpg


c4d-openGL.jpg


Vue-openGL.jpg
 
Here are attached images of this problem. As you can see, C4D and iStat Menus chose the GT 120 instead of the 5870, which is the default card as shown on the System Profiler page. atMonitor (second image) seems to detect my 5870, but it reports that its VRAM is not being utilised at all, with the system preferring the GT 120. Maybe I am reading that wrong?

Again, I don't have this problem on the Windows partition of the same computer, so it cannot be a hardware issue. And whoever said there is no problem with C4D is wrong.

It could be the application vendor's fault, but something like iStat Menus is written expressly for the Mac. If even it cannot work out the default card (which, I might add, is the GPU actually displaying the application), what does that tell you?

I could understand if a vendor like Adobe has written a lesser version compared to its Windows counterpart, consequently making it their fault (and I wouldn't be surprised, seeing as Apple treated them like a bunch of idiots), but most of my applications do this. So is it every application's fault? Or, if we apply Occam's razor, an Apple thing?

Could it be an issue between the two different graphics card vendors? If it is, the conflict is not visible on Windows, which brings it back to Apple...

This is so off-putting that I have decided to work only on the Windows partition of my Mac. I am not a "Windows fan-boy", but as my career over the last eight years has become more and more specialised, I see the Mac catering less and less to my needs, even though it professes the contrary. All I can say is: thank God for Boot Camp.
 

Attachments

  • Screen shot 2.jpg
  • Screen shot 2011-02-22 at 13.36.42.jpg
I could understand if a vendor like Adobe has written a lesser version compared to its Windows counterpart, consequently making it their fault (and I wouldn't be surprised, seeing as Apple treated them like a bunch of idiots), but most of my applications do this. So is it every application's fault? Or, if we apply Occam's razor, an Apple thing?

How can it be Apple's fault when CS4 detects the correct card but CS5 does not?

As goMac described (more than once), GPU detection is quite easy to do, so the only ones to blame are the software vendors, who apparently have some quality-assurance problems (at least in their OS X software, if you're saying that the exact same software runs without this issue on Windows).
 
Adobe Photoshop seems to be the exception to the rule, in that it had issues with default graphics card detection on Windows as well. Though I am talking about dedicated 3D applications, which use OpenGL more heavily.

After doing more experimentation I decided to download Rhino 5 for Mac, and I can see that it was able to detect the 5870 (image below), which supports the argument that it is each application vendor's fault, to some extent. However, if Apple is not aware of this, and is not willing to work more closely with the range of professional applications needed to class the Mac Pro as a professional machine, isn't there blame for Apple as well?

To claim these applications run on the platform but not to care about or be aware of these issues is pure laziness on their part. For instance, Rhino for the Mac only uses OpenGL 2.1, while on Windows it is now on OpenGL 4. Would you invest your money and place the hopes of your business in a company that allows this? Why build a 12-core computer when the OS cannot offer OpenGL 4?

Maybe the vendors' laziness about these crucial issues demonstrates their outlook on the Mac for 3D, and Apple's mutual laziness and aloofness towards the vendors in question. So, for now, I stand by what I said:

Thank God For Boot Camp.

I love the Mac Pro, and Mac OS is a beautiful creation to use, but it lacks and even disallows specialisation in professional sectors, which unfortunately forces people back to the Dark Side.
 

Attachments

  • Screen shot 2011-02-22 at 15.18.37.jpg
After doing more experimentation I decided to download Rhino 5 for Mac, and I can see that it was able to detect the 5870 (image below), which supports the argument that it is each application vendor's fault, to some extent. However, if Apple is not aware of this, and is not willing to work more closely with the range of professional applications needed to class the Mac Pro as a professional machine, isn't there blame for Apple as well?

Apple is perfectly willing to work with application vendors on this. As I previously noted, Apple had an entire session at WWDC (Session 422) covering selecting a GPU on a Mac Pro with an NVIDIA and an ATI GPU. It was a long session on this exact topic for developers, with live demos and sample code; you can get the video if you have a developer account. Apple even told developers clearly that they need to handle these situations in their code, especially as dual-GPU MacBook Pros become more common (which is another situation I've heard of: dumb applications picking the integrated Intel graphics over the NVIDIA card).

If application developers aren't willing to listen to Apple and instead ship poorly coded applications, what is Apple supposed to do? Apple can't do anything about an application that consistently picks the wrong GPU and doesn't give the user a choice.

Having the OS force a GPU is a bad idea for many technical reasons. You'll notice that no OS on the market lets you do this.
 
Fesco... I have the exact same problem you are demonstrating, only with the 4870 and the GT120.

The screen grabs I posted show the higher-end card only because the problem has forced me to disconnect my GT120 from its display (a display which now sits dormant).
 