Bleh, similar situation with both of those companies. Some things run really well under OSX, but the OpenGL performance is annoying. The Quadro cards basically consist of more RAM and adjusted drivers; there isn't anything more to it. Even the level of technical support isn't significantly better. Unfortunately, whenever we see one under OSX, the drivers seem to have significant issues. This is both an NVidia and an Apple issue: Apple is very restrictive with its code, and I imagine they haven't been so great about working with NVidia on this, especially seeing as the Mac Pro + Quadro user base is relatively small (in Apple terms). What I'd like to see is 10-bit DisplayPort on the Mac, though I doubt it'll ever happen. Supposedly it worked with a few cards a while back, but Lion and Thunderbolt lack drivers for it.

You know, I hear this from some people and the exact opposite from others.

The truth is that there is not a major hardware difference. But that does not mean that Quadro cards do not offer increased speed inside of professional apps that utilize the card.

You say "different drivers" as if that is a bad thing, when in reality hardware is only as good as its software. You're trying to play it off like there isn't a big difference, when that is not at all true. There is a huge difference between Quadro/FirePro cards and their consumer-grade counterparts. But if you are not using professional applications that utilize the benefits of a professional card and just need the power behind a consumer-grade card, go for it. But it would be a lie to say that there is no difference between professional cards and consumer cards.

And OS X has awful support for professional cards. That cannot be said about Windows, as much as it pains me to say. So, as a guy who does all his work in Autodesk applications and some Adobe applications, I find it laughable that you try to play it off as if there is no difference between a professional card and a consumer card.

Just so you know I am not making this up: it is obvious that even a FirePro V8750 kills an ATI 5870 inside of Maya. The interesting part is that there is such a substantial speed difference even though the 5870 clearly has more raw power.

Now, the FirePro was about 4.5 times more expensive at the time, yet some of the benchmarks show nearly a 10x speed advantage for it. Consider also that a Quadro 5000 kills the FirePro in nearly every benchmark; couple that with CUDA support for the Mercury Playback Engine and a nearly identical list price of around $1,800, and there is no way you can even try to downplay the benefits of a professional GPU.

EDIT: Not to mention the Quadro 5000 even beats the newer FirePro V8800 and V9800 in benchmarks of popular 3D programs, Maya in particular since that is what I use most. Now consider that the V9800 is $1,000 more expensive than the Quadro 5000.
http://hothardware.com/articleimages/Item1565/9800_maya.png
http://hothardware.com/articleimages/Item1565/9800_catia1.png
http://hothardware.com/articleimages/Item1565/9800_ensight.png
http://hothardware.com/articleimages/Item1565/9800_lightwave.png

So just because a consumer card is cheaper does not mean it is better, and just because the specs are better does not mean it will always perform better. There is more to it than numbers on a specifications page. Also, I realize this is not a Quadro vs FirePro thread; I just lost control a bit ;)
 
What's this feature officially called, so I can buy a TV without it? (Sorry OP for derailing the thread.)

Each manufacturer has their own name for it.

Auto Motion Plus
Clear Motion Rate technology
Subfield HD Motion
Cinema Smoothening

To name a few...
 
You know, I hear this from some people and the exact opposite from others.

The truth is that there is not a major hardware difference. But that does not mean that Quadro cards do not offer increased speed inside of professional apps that utilize the card. :(

You say "different drivers" as if that is a bad thing, when in reality hardware is only as good as its software. You're trying to play it off like there isn't a big difference, when that is not at all true. There is a huge difference between Quadro/FirePro cards and their consumer-grade counterparts. But if you are not using professional applications that utilize the benefits of a professional card and just need the power behind a consumer-grade card, go for it. But it would be a lie to say that there is no difference between professional cards and consumer cards.

And OS X has awful support for professional cards. That cannot be said about Windows, as much as it pains me to say. So, as a guy who does all his work in Autodesk applications and some Adobe applications, I find it laughable that you try to play it off as if there is no difference between a professional card and a consumer card.

I may have overstated it a bit. The higher-end cards you refer to are quite a jump from some of the lower Quadro options. OSX doesn't have anything past the Quadro 4000, and that was full of bugs (not sure if they've been fixed). I would pay the extra for a quality GPU under Windows; you just don't get as much out of it under OSX, as you stated :p. That Maya link doesn't include the Quadro 4000 under Windows :(. I was wondering how it performs there.

Earlier, much of what I was getting at is that a big part of what you're buying with a Quadro is the driver. Apple doesn't allow access to much of its code, and we end up with a subpar driver. Correct me again if I'm wrong, but doesn't that kill much of the value of owning such a card on a Mac unless you're working in bootcamp frequently?
 
Long refresh cycles save money.

I bought a 2008 2.8 GHz octo Mac Pro too; four years later, the average i7-powered PC tower is just beginning to beat it for CPU performance. Do I now need to say, "Help! Help! The crap boxes have caught up! Quick, make me spend thousands on a new Mac Pro!"?
Er... no. My Mac Pro still does what it did before, and thanks to upgrading the RAM to 16 GB and the graphics to a 5870, and adding an SSD, it does it even faster than it previously did. Workstations benefit from longer life because of upgradable components, not just replacing the old broken stuff with the same.
In 4 years my MP '08 has risen from a Geekbench score of 7000 to 11000, for a modest investment in newer parts and expansion.

Longer refreshes give your business plenty of time to gain more profit from fast, reliable hardware, let you choose when to replace it, and free up the saved expenditure for other things your firm may benefit from.
Patience, foresight, and proper budgeting will be rewarded with a newer, faster Mac eventually. Till then, if you don't like the prices and the "2 year old" specs, keep yer money where it is needed or invest in something else that will help your business grow.

I only fix stuff when it breaks, what do you do when it doesn't? :D
 
OSX doesn't have anything past the Quadro 4000, and that was full of bugs (not sure if they've been fixed).

The Quadro 4k in my Mac has been rock solid since I installed it. I'm running Lion, FWIW, and it has excellent nVidia drivers included with it. The only extra drivers that are required are the CUDA ones from nVidia directly. You don't need to add baseline card drivers like you do with Snow Leopard.

As has always been the case: nVidia's hardware is quite powerful, but the monkeys they have writing drivers don't always do the right thing. With Lion, the only thing you're required to install is the CUDA driver package from nVidia, thereby reducing your "nVidia-written drivers" potential disaster by 50%. ;)

jas
 
Did we really need another "Mac OS is not for the pros" thread?

No, but we do probably need the regular reminders :D

CS5 works fine with Lion. Not to mention this is Adobe's software, not Apple's; it's Adobe's responsibility to upgrade it. If you want Windows, use it, but Microsoft is aiming for a much shorter release schedule too... Professionals use software on an OS to get the job done. If it's too hard for someone (not implying you; I'm speaking in general) to adapt to a new workflow, then they shouldn't be in a job that requires computers.

Well said, although workflow changes hit every industry ... hey, I need to write a letter - and ask the secretary to use carbon paper :eek: to type it up in duplicate, please...

I own and love my Mac Pro... but this is my last. Once this one kicks off, I will not be buying another. My MBA is a GREAT Mac that does everything I need it to do, and for intense computations it will be cheaper and better to just build a PC/Linux box.

Bully for you. Of course, we're all different in our needs ...


BTW, I work in MANY MANY facilities in LA, and nearly all of them that do graphic design are using iMacs! I only see Mac Pros when real power is needed, like for what I do, and animators, etc. The iMac is basically taking over for design, though, and for the high-end needs it is mainly Linux boxes now.

So?

Isn't what you're observing simply that the number of tasks requiring significant investment in major iron for the "really heavy lifting" is naturally becoming smaller? It appears to me that current technology is ahead of our expectations for what a PC should generally be able to handle... which IMO isn't particularly surprising, because... just when was our last new "Killer App" which has driven hardware demands?

If an iMac can perform today's tasks, then so be it.

What remains unchanged is that the core business case will only ever spend the bucks when there is a benefit of making an unproductive employee more productive. If that employee could be just as productive with a pad of paper, an IBM Selectric, and a box of carbon paper, then that $1500 iMac wouldn't have been purchased either... or even a $300 Dell cheap-o tower.

Overall, the broader question here is what the future holds for the niche market of higher/highest-performance systems, where we are often weighing the wisdom of investing fairly significant extra dollars for a relatively modest performance gain, thanks to the law of diminishing returns.

What makes the business analysis more complicated here is that a stand-alone PC isn't the sole approach: there are also alternatives such as render farms to consider.

And no matter what solution is considered, each one is going to have its own ROI, as well as other trade-offs.

-hh
 
I HATE that mode!!! What's worse is some of my friends don't see it! I try to explain it to them, and the best explanation I can come up with is "frame interpolation," since that appears to be what this "feature" is doing, yet everyone thinks I'm crazy and says the picture looks fine.

What's this feature officially called, so I can buy a TV without it? (Sorry OP for derailing the thread.)

240Hz playback automatically makes everything look like crap! I don't get it either. I use a Mac Mini as an HTPC and I tend to use either 60Hz (NTSC) or 24Hz (for 24p movies), and it looks fantastic!!
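
For the curious, the "feature" is roughly this: the TV synthesizes extra frames in between the real ones. Real sets do motion-compensated interpolation, which is much fancier, but even the naive version, a weighted blend of two neighboring frames, shows where those artificial-looking frames come from. A rough sketch (8-bit grayscale for simplicity; this is an illustration, not any TV's actual algorithm). Ironically, 240 divides evenly by both 24 and 60 (240 = 24 x 10 = 60 x 4), so a 240Hz panel could show 24p film and 60Hz video judder-free with interpolation switched off.

Code:
#include <stddef.h>
#include <stdint.h>

/* Synthesize an in-between frame by linearly blending two neighboring
 * frames. t in [0,1] is the position of the new frame between frame a
 * (t = 0) and frame b (t = 1). Real TVs estimate motion vectors instead
 * of blending, but the synthesized-frame idea is the same. */
void blend_frames(const uint8_t *a, const uint8_t *b, uint8_t *out,
                  size_t npixels, float t)
{
    for (size_t i = 0; i < npixels; i++)
        out[i] = (uint8_t)((1.0f - t) * a[i] + t * b[i] + 0.5f);
}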
 
No, but we do probably need the regular reminders :D

;)

240Hz playback automatically makes everything look like crap! I don't get it either. I use a Mac Mini as an HTPC and I tend to use either 60Hz (NTSC) or 24Hz (for 24p movies), and it looks fantastic!!

What's worse is many people don't see it! (They think I'm crazy.) I don't get how they don't; I spot it instantly. It actually makes me sick seeing it.
 
;)



What's worse is many people don't see it! (They think I'm crazy.) I don't get how they don't; I spot it instantly. It actually makes me sick seeing it.

I work in feature film VFX, so I look at moving pictures all day, every day. I can spot bad frame rates, bad aspect ratios, and dropped frames like it's nothing.

Before I started using XBMC, the poor frame rates of Flash video on Hulu drove me CRAZY!!! I could barely stand to watch it.
 
I may have overstated it a bit. The higher-end cards you refer to are quite a jump from some of the lower Quadro options. OSX doesn't have anything past the Quadro 4000, and that was full of bugs (not sure if they've been fixed). I would pay the extra for a quality GPU under Windows; you just don't get as much out of it under OSX, as you stated :p. That Maya link doesn't include the Quadro 4000 under Windows :(. I was wondering how it performs there.

I remember hearing that the Mac Quadro 4000 was awful in comparison to the same card under Windows. I am also not sure if the issues have been fixed. NVIDIA seems to think the Windows and Mac versions have the same performance, but I sort of doubt they actually did any real-world testing on that.

Earlier, much of what I was getting at is that a big part of what you're buying with a Quadro is the driver. Apple doesn't allow access to much of its code, and we end up with a subpar driver. Correct me again if I'm wrong, but doesn't that kill much of the value of owning such a card on a Mac unless you're working in bootcamp frequently?

Yes, it totally does. And that is why I plan on building my own workstation when it is time to upgrade my Mac Pro. However, for OS X usage I will still be using an MBP. But all my intensive programs will need to be run under Windows.

And even if you are working in Windows most of the time via bootcamp, having to switch your monitors between GPUs can be a pain.

On a side note, I have heard that even AMD's cheaper FirePro options will outperform any consumer-grade card, even $600 ones, when it comes to professional applications. However, I am not sure how they would do against multiple high-end consumer cards. But it just goes to show how important software is when working inside professional applications.
 
...just when was our last new "Killer App" which has driven hardware demands?
... workin' on it!* :)

* [in Geordi La Forge voice as he slides his chair-on-rails across engineering frantically doing stuff]

I am very satisfied with the 5870, but I want more. Want a 7980! FWIW, I tried my thingie on an ultra-maxed Pro w/ Quadro 4000 in the lab to see how that card compares. Now, I didn't specifically tune for that card, but it was the same code I'd finely tuned for the MBP's 330M, and its throughput was *crummy*: around 30% of the 5870's. (I did, though, have to vectorize to float4s to get the most out of the 5870's VLIW architecture.)

I send 60% of the (OpenCL) job to the GPU and 40% to the (4-core) CPU. With a 12-core Pro I could do without the GPU. With a new Pro and a 7980, I could do some mix of higher res, faster frame rate, and a second screen. Very excited about the possibilities.
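
For anyone curious to try the same trick: the host-side split is pretty simple. You put both devices in one context, give each its own command queue, and launch the same kernel on each queue with a global work offset. A stripped-down sketch of the idea (not my actual code; the placeholder kernel, sizes, and hard-coded 60/40 ratio are made up, and error checking and data transfer are omitted):

Code:
#include <OpenCL/opencl.h>   /* <CL/cl.h> on non-Apple platforms */

static const char *src =
    "__kernel void saxpy4(__global float4 *x, __global float4 *y, float a) {\n"
    "    size_t i = get_global_id(0);\n"
    "    y[i] = a * x[i] + y[i];\n"   /* float4 math, as tuned for VLIW */
    "}\n";

int main(void)
{
    cl_platform_id plat;
    cl_device_id gpu, cpu;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &gpu, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &cpu, NULL);

    /* One context holding both devices, one queue per device. */
    cl_device_id devs[2] = { gpu, cpu };
    cl_context ctx = clCreateContext(NULL, 2, devs, NULL, NULL, NULL);
    cl_command_queue qgpu = clCreateCommandQueue(ctx, gpu, 0, NULL);
    cl_command_queue qcpu = clCreateCommandQueue(ctx, cpu, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 2, devs, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "saxpy4", NULL);

    size_t n = 1 << 20;               /* total float4 elements */
    size_t gpu_n = (size_t)(n * 0.6); /* 60% of the range to the GPU... */
    size_t cpu_n = n - gpu_n;         /* ...and the remaining 40% to the CPU */
    size_t cpu_off = gpu_n;

    cl_mem x = clCreateBuffer(ctx, CL_MEM_READ_ONLY,  n * sizeof(cl_float4), NULL, NULL);
    cl_mem y = clCreateBuffer(ctx, CL_MEM_READ_WRITE, n * sizeof(cl_float4), NULL, NULL);
    float a = 2.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &x);
    clSetKernelArg(k, 1, sizeof(cl_mem), &y);
    clSetKernelArg(k, 2, sizeof(float), &a);

    /* Same kernel, two queues: each device chews on its own slice. */
    clEnqueueNDRangeKernel(qgpu, k, 1, NULL,     &gpu_n, NULL, 0, NULL, NULL);
    clEnqueueNDRangeKernel(qcpu, k, 1, &cpu_off, &cpu_n, NULL, 0, NULL, NULL);
    clFinish(qgpu);
    clFinish(qcpu);
    return 0;
}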

If Apple *were* to discontinue instead of updating (yeah right), I'd probably free up some money to buy a couple of 12-cores to stash away. If I had to change platforms, I wouldn't have the luxury of simply waiting for a vendor to update and then shelling out a few hundred bucks; I'd be looking at hundreds of hours of very painful work.
 
And even if you are working in Windows most of the time via bootcamp, having to switch your monitors between GPUs can be a pain.
I have a cheap iogear KVM switch with a remote that works great for switching my PC and Mac desktops at the touch of a button, but I'm using two physical desktops. :)
 
If you are old enough, which I doubt, I will quote Rhett Butler's ending line in Gone with the Wind: "Frankly, my dear, I don't give a damn!" Rant all you want and get it out of your bleak system, but do it elsewhere.
 
I remember hearing that the Mac Quadro 4000 was awful in comparison to the same card under Windows. I am also not sure if the issues have been fixed. NVIDIA seems to think the Windows and Mac versions have the same performance, but I sort of doubt they actually did any real-world testing on that.



Yes, it totally does. And that is why I plan on building my own workstation when it is time to upgrade my Mac Pro. However, for OS X usage I will still be using an MBP. But all my intensive programs will need to be run under Windows.

And even if you are working in Windows most of the time via bootcamp, having to switch your monitors between GPUs can be a pain.

On a side note, I have heard that even AMD's cheaper FirePro options will outperform any consumer-grade card, even $600 ones, when it comes to professional applications. However, I am not sure how they would do against multiple high-end consumer cards. But it just goes to show how important software is when working inside professional applications.


That reminds me, do you know of any good curvature or surface-continuity diagnostic tools for Maya? It's pretty annoying modeling complex surfaces within it. I'm not saying it's impossible or anything, and I'm not blaming the tool. I just wish it had some kind of diagnostic tools (apart from test rendering several angles with a glossy material applied), especially as what you see on screen is an approximation. Mostly I want to achieve better highlight flow without trying to create it via shader voodoo. I'm kind of weird about these things, and it has to work at higher-resolution print outputs, which is why I wish it had CAD-like diagnostic tools.

Hehe... considering you, Chrono, and Gentlefury are all reading the same thread, I figured someone might have an idea.

The FirePros under Windows support 10-bit DisplayPort output, which pleases me immensely. At one point Apple would brag about advances like this and try to implement them before Windows. Now I feel like they're just trying to coast on the maturity of the platform and its entrenched customer base + trendiness factor. It's quite annoying, as there are things that I really really really like about it, and I've been using OSX for almost a decade.
 
I have a cheap iogear KVM switch with a remote that works great for switching my PC and Mac desktops at the touch of a button, but I'm using two physical desktops. :)

So, something like this or this?


That reminds me, do you know of any good curvature or surface-continuity diagnostic tools for Maya? It's pretty annoying modeling complex surfaces within it. I'm not saying it's impossible or anything, and I'm not blaming the tool. I just wish it had some kind of diagnostic tools (apart from test rendering several angles with a glossy material applied), especially as what you see on screen is an approximation. Mostly I want to achieve better highlight flow without trying to create it via shader voodoo. I'm kind of weird about these things, and it has to work at higher-resolution print outputs, which is why I wish it had CAD-like diagnostic tools.

Hehe... considering you, Chrono, and Gentlefury are all reading the same thread, I figured someone might have an idea.

Sorry, I can't think of anything at the moment. If something comes to mind I'll try and remember to post it on here.
 
So, something like this or this?
I think I got this one, but don't think I paid that much for it.

 
I think I got this one, but don't think I paid that much for it.


I definitely see where that would come in handy. It also looks a lot like the Rosewill one I saw.

However, if you have a monitor with multiple inputs and can manually select the input, wouldn't that work just as well? Say you have a monitor with DVI and DisplayPort. You plug DVI into, say, a Quadro card and DisplayPort into a consumer card for other programs and windows. Would you then just be able to change the input? Or would the computer see it as a monitor on both GPUs, thus making a KVM switch necessary?
 
I definitely see where that would come in handy. It also looks a lot like the Rosewill one I saw.

However, if you have a monitor with multiple inputs and can manually select the input, wouldn't that work just as well? Say you have a monitor with DVI and DisplayPort. You plug DVI into, say, a Quadro card and DisplayPort into a consumer card for other programs and windows. Would you then just be able to change the input? Or would the computer see it as a monitor on both GPUs, thus making a KVM switch necessary?
I didn't know how a monitor would react to having two inputs, and figured that mounting the button in an ideal place would be nicer. I can just tap the button, whereas with two inputs on a monitor you'd have to reach up to the monitor and push what I assume is not as simple as one tap on various buttons. It sounds simpler to just plug two cables into one monitor, but in actual use it may turn out much less convenient. :p
 
So, something like this or this?




Sorry, I can't think of anything at the moment. If something comes to mind I'll try and remember to post it on here.

Thanks man. There are probably better programs for what I'm trying to do, but it gets quite expensive just buying more and more software. I also don't like to get too tied to obscure programs in case I'm not working on my own computer.


I definitely see where that would come in handy. It also looks a lot like the Rosewill one I saw.

However, if you have a monitor with multiple inputs and can manually select the input, wouldn't that work just as well? Say you have a monitor with DVI and DisplayPort. You plug DVI into, say, a Quadro card and DisplayPort into a consumer card for other programs and windows. Would you then just be able to change the input? Or would the computer see it as a monitor on both GPUs, thus making a KVM switch necessary?

I tested it on two different computers; switching inputs via the OSD seems to work. Both were running OSX. I could go into bootcamp on one. Bootcamp seems to be slightly finicky if it doesn't see the display while booting, as opposed to OSX, where you can hot-plug anything.
 
I didn't know how a monitor would react to having two inputs, and figured that mounting the button in an ideal place would be nicer. I can just tap the button, whereas with two inputs on a monitor you'd have to reach up to the monitor and push what I assume is not as simple as one tap on various buttons. It sounds simpler to just plug two cables into one monitor, but in actual use it may turn out much less convenient. :p

Ya, and apparently one of my monitors can switch inputs, so I just tried it: OS X still thinks there is a monitor there even though it was set to a different input.

Now my use of a KVM would really just be switching a monitor between two GPUs instead of between two separate computers. I found a great tool in OS X to do this called SwitchResX, and it works perfectly. So I can easily disable a monitor and OS X will no longer think it is there. Do you know if there is a Windows equivalent to this? Except instead of just disabling a monitor, I would need something that can disable individual inputs and switch between them. Or would a KVM just be better?
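
EDIT: Digging around a bit, Windows does seem to expose this at the API level via ChangeDisplaySettingsEx, so even if there's no ready-made SwitchResX equivalent, a tiny utility could probably do it. Completely untested sketch based on the MSDN docs (the device name is a guess; you'd enumerate with EnumDisplayDevices first):

Code:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Assumed device name; enumerate with EnumDisplayDevicesA to find
       the display you actually want to detach. */
    const char *dev = "\\\\.\\DISPLAY2";

    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    /* Zero width/height with these fields set marks the display detached. */
    dm.dmFields = DM_POSITION | DM_PELSWIDTH | DM_PELSHEIGHT;

    /* Stage the change in the registry without applying it yet... */
    LONG r = ChangeDisplaySettingsExA(dev, &dm, NULL,
                                      CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
    if (r != DISP_CHANGE_SUCCESSFUL) {
        printf("staging the change failed: %ld\n", r);
        return 1;
    }
    /* ...then apply all staged display changes at once. */
    ChangeDisplaySettingsExA(NULL, NULL, NULL, 0, NULL);
    return 0;
}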
 
Ya, and apparently one of my monitors can switch inputs, so I just tried it: OS X still thinks there is a monitor there even though it was set to a different input.

Now my use of a KVM would really just be switching a monitor between two GPUs instead of between two separate computers. I found a great tool in OS X to do this called SwitchResX, and it works perfectly. So I can easily disable a monitor and OS X will no longer think it is there. Do you know if there is a Windows equivalent to this? Except instead of just disabling a monitor, I would need something that can disable individual inputs and switch between them. Or would a KVM just be better?
I don't, but it sounds cool! I'll have to look for something.
 
I don't, but it sounds cool! I'll have to look for something.

It seems like KVMs are more for switching between multiple computers and might be a bit overkill for what I need. If I could find a cheap software solution, that might be more practical than a KVM for my use.
 
I am a diehard matte-screen user who owned and sold an older 20in 2007 iMac, but I have gotten used to a newer 27in iMac in a new office with controlled lighting. The real drawback is the increased contrast, which is a pain with design projects. I have to use a second monitor to get a real-world example and then modify my design on the iMac. That kind of defeats the point of an AiO.

Sure, I can get by with an iMac, but I will always prefer a tower with internal RAID + a matte screen. I don't hate Windows, but they would have to take away everything but the MBA and turn OS X into iOS to get me to abandon the platform completely.
 