
TrumanLA

macrumors member
Original poster
I'd like to take a vote as to which GPU you think is the best bang per buck for each program:

FCP 7
FCP X
Premiere
After Effects
DaVinci


I'm partial to nVidia GPUs and assume the hierarchy looks like this:

GTX 680 2GB
GTX 680 4GB
GTX 980 4GB
GTX 980 Ti 6GB
Titan X Maxwell 12GB

Thanks for any and all who participate
 
GTX 980 prices have dropped nicely since the arrival of the 10 series (Pascal). I wouldn't get anything less than that.

Even though you might find it odd, I would invest in AMD cards. Apple supposedly will keep using them for the indefinite future, so I suspect they will heavily optimise for those.
There are some nasty bugs with Nvidia cards, and there are zero plans to fix them.
 
FCPX - AMD card; I use the R9 380.

Adobe - Nvidia. I find After Effects a slug and always prefer using Motion.
 
Great advice re: pricing of the 980... but honestly, some people don't even have that money. I'm just looking for the best CUDA cards.

Also, Phobos, you make an excellent point. I hear OpenCL is very similar to CUDA in how it works. The problem is, while I can flash (or buy flashed) nVidia cards... the most powerful ATI cards are listed by Netkas as absolutely not working in newer versions of the OS. What ATI card even remotely rivals a Titan X (Maxwell, obviously)... and what $120 ATI card rivals a 680? Unless, of course, there's an implicit premise to your point that we're focusing on FCP 7 or FCP X..?

And please note: I KNOW I have erroneous and ignorant views, as all my knowledge is second-hand, inference, etc. I don't game or edit/render/color-correct video... so I must rely on the opinions of you all.

Thanks to those of you who've already helped. I greatly appreciate it. :)
 
Be careful: that Maxwell card may show glitches in Adobe apps.

I don't think FCP 7 can use modern GPU power at all.

FCP X is highly optimised for AMD cards. My dual 7950 setup only costs about $200 nowadays and can beat the Titan X in FCPX.

Not familiar with DaVinci, but are you sure it works with CUDA? Just because a card has CUDA doesn't mean the software can use it for compute.
 
I currently use Adobe CC apps (Premiere Pro, After Effects, Photoshop, Illustrator) on my MP 5,1. I have a GTX780 in it. With the release of Sierra I noticed I can now choose to use CUDA or OpenCL in PP or AE.

From what I have read Apple supplies drivers for Nvidia cards up to the GTX780 and they updated those drivers with Sierra. I have thought about upgrading the card but that would mean manually updating drivers and since everything works great now (smooth and fast) - well, if it ain't broke I have no plans to upgrade for now.

Lisa
 
WONDERFUL info guys.

I have access to 980Ti, 980, 680 -- and would be willing to buy some cards based on recommendations of you guys here so that I can provide quick demonstrations for people.

I did read a thread explaining that the 7950 corresponds to one of the R series cards, and so on. Learning how the nomenclature lines up will take me a little time, but I'll get there.

Does anyone here have recommendations for how to demonstrate the trade-offs of the cards? I.e., show the lackluster performance of a Titan X in FCP X vs. an optimal R9 setup, and so on?

THANK YOU all, VERY much. This is exactly what I need.

As an aside, I've been trying to stock up on the SM951 drives to give people the fastest bootable drives... as well as using USB 3.1 rev 2.

Obviously, I want to use the best CPU for the money, which seems like the 6x 3.33GHz...
The best bang-per-buck bootable SSD seems to be the MBPr x2 SSD in 256GB
The fastest bootable SSD is the SM951
I believe the fastest (and most practical, since eSATA isn't on many devices) external interface is USB 3.1 rev 2 (rough throughput check sketched right after this list)
And of course, the GPU that is optimal for their biggest weekly slowdown.
Mirrored RAID that rebuilds when a drive is replaced (the 7K4000 is my default)
3x module sizing for 24, 48, 96 GB RAM -- etc.

Does anyone see any smart upgrades that I'm overlooking?
I currently use Adobe CC apps (Premiere Pro, After Effects, Photoshop, Illustrator) on my MP 5,1. I have a GTX780 in it. With the release of Sierra I noticed I can now choose to use CUDA or OpenCL in PP or AE.

From what I have read Apple supplies drivers for Nvidia cards up to the GTX780 and they updated those drivers with Sierra. I have thought about upgrading the card but that would mean manually updating drivers and since everything works great now (smooth and fast) - well, if it ain't broke I have no plans to upgrade for now.

Lisa


I have a GTX 780 in stock -- I could compare that to a 980 and a 980 Ti -- what should I use to evaluate the comparative performance in the most demonstrable manner?
 
Even though you might find it odd, I would invest in AMD cards. Apple supposedly will keep using them for the indefinite future, so I suspect they will heavily optimise for those.
There are some nasty bugs with Nvidia cards, and there are zero plans to fix them.

What bugs are those and what is the source that there are zero plans to fix them?
 
Even though you might find it odd, I would invest in AMD cards. Apple supposedly will keep using them for the indefinite future, so I suspect they will heavily optimise for those.
There are some nasty bugs with Nvidia cards, and there are zero plans to fix them.

As far as Apple's game plan to stick with ATI -- that's impossible to be "indefinite." We merely have no knowledge of when they'll change. The fact that they have switched every 2-3 years has given us a false sense of confidence that their purchasing will remain static. ALL Apple is doing is manipulating nVidia into lowering their prices. Maybe nVidia didn't cover a GPU issue properly... maybe their GPU failure rate was unreasonable relative to their speed advantage and the liability involved. The [only] thing we know -- is that Apple switched away from nVidia between 2014 and mid-2015... and subsequently [appear] to have doubled down by using ATI in the new form-factor MBPr with TB3 (which I look forward to discussing with everyone).

That in NO WAY precludes Apple from switching back to nVidia -- for instance, by continuing the previous form-factor MBPr (with TB2) with subsequent GPU revisions that COULD switch to nVidia.

While I recognize that they could virtually abandon revisions to the TB2 version of the MBPr and just keep it parallel to the TB3 series... in the same sense that they kept the MBP Unibody alongside the retinas through 2013-2014 only to retire it... they COULD KEEP the previous-version MBPr, which has a variety of ports and a REMOVABLE flash drive -- and continue UPGRADING that model into the future... indefinitely. This is perfectly plausible despite the fact they didn't do that with the non-retina. The DVD drive drove that choice... which is a different enough situation to warrant a different decision.

While that last sentence might seem insignificant, I personally would stick with the TB2 version INDEFINITELY if they kept it -- and kept the replaceable SSD -- as I think it's absolutely insulting and infuriating that they SOLDERED the TB3 version's SSD. AB. SURD!!
 
I have a GTX 780 in stock -- I could compare that to a 980 and a 980 Ti -- what should I use to evaluate the comparative performance in the most demonstrable manner?

I don't run any specific app to test a card. I open a big project in PP or AE and see how the card handles previews after adding all kinds of changes to the original footage. If the card can show a stutter-free preview, I am happy. Also, time to encode the final project is a motivator for me. AE will really put the stress on a card. I only use it for small, short pieces like intros, but if I ever decide to expand to longer pieces I would be looking for a faster card with more onboard memory.

Obviously there are all kinds of stats on all three cards. Here is a chart that gives a comparison:

http://www.hardwaresecrets.com/nvidia-geforce-chips-comparison-table/

Lisa
 
As far as Apple's game plan to stick with ATI -- that's impossible to be "indefinite." We merely have no knowledge of when they'll change. The fact that they have switched every 2-3 years has given us a false sense of confidence that their purchasing will remain static. ALL Apple is doing is manipulating nVidia into lowering their prices. Maybe nVidia didn't cover a GPU issue properly... maybe their GPU failure rate was unreasonable relative to their speed advantage and the liability involved. The [only] thing we know -- is that Apple switched away from nVidia between 2014 and mid-2015... and subsequently [appear] to have doubled down by using ATI in the new form-factor MBPr with TB3 (which I look forward to discussing with everyone).

That in NO WAY precludes Apple from switching back to nVidia -- for instance, by continuing the previous form-factor MBPr (with TB2) with subsequent GPU revisions that COULD switch to nVidia.

Indefinite by definition is an unknown amount of time, so we're not disagreeing there.
They will come around at some point and use Nvidia again, but it seems that the tech industry in general is not very satisfied with Nvidia's aggressiveness. And that includes Apple.

Nvidia is pushing in a direction which is not compatible with what Apple is doing; their wants don't line up with Apple's. Nvidia is pushing CUDA among other things, and Apple is showing zero interest in following that.
And from what I'm hearing from programmers in different industries, there's zero driver development and bug fixing for Nvidia drivers on the Mac side.
So I imagine it will probably take longer than 3 years for the stars to align and for Apple to use Nvidia again.

No one knows for sure, but it's an educated guess from what I read and see daily.

Vega is supposedly going to be more powerful than what Nvidia produces now.
But of course this is PR talk; we will know for sure in the next few weeks.
As for the drivers, I have no clue if these cards will be supported, but if Apple is even remotely serious about the pro market (probably not) they will try to support them actively. I don't want to get your hopes up, because Apple in general is not too strong on graphics card support.
 
What bugs are those and what is the source that there are zero plans to fix them?

I'm also curious about this, as most (all?) of the bugs I've seen people complaining about here have actually been fixed in the NVIDIA web drivers for quite some time.
 
The bugs are not related to OS glitches, but to deeper problems in how the system operates.
For example, there are a lot of 3D developers implementing their 3D renderers who have hit a wall when trying to implement OpenCL-specific workflows on Nvidia cards. This has to do with the fact that Apple has basically abandoned OpenCL, and coupled with the fact that they're not planning to support Nvidia any time soon, people have to scratch their heads over how to circumvent those problems.

Otoy, if I remember correctly, mentioned in a lot of blog posts their problems with moving their renderer to OpenCL. They basically abandoned the effort because they hit immense roadblocks, and tried to use Metal, which they also found was more advanced on iOS than on the Mac.
And these aren't the only developers voicing their problems and concerns.
 
I've not read all the comments, but most of those apps (all apart from Resolve) are mostly CPU dependent.
Resolve is the only one that is really GPU dependent.

Which versions of Adobe are you using? CC is fine with OpenCL.

I always post a link to this when people ask about pro apps: https://www.pugetsystems.com/all_articles.php - they have articles on both CPU speed/cores and GPU benches in pro apps: CS6 CPU/GPU benches, CC benches, etc.

Also, Adobe has publicly mentioned a lot that CUDA is dead; they're moving away from it more and more.

If you really want a fast box for work >.> it might be worth looking at Windows too.
 
Indefinite by definition is an unknown amount of time, so we're not disagreeing there.
They will come around at some point and use Nvidia again, but it seems that the tech industry in general is not very satisfied with Nvidia's aggressiveness. And that includes Apple.

Nvidia is pushing in a direction which is not compatible with what Apple is doing; their wants don't line up with Apple's. Nvidia is pushing CUDA among other things, and Apple is showing zero interest in following that.
And from what I'm hearing from programmers in different industries, there's zero driver development and bug fixing for Nvidia drivers on the Mac side.
So I imagine it will probably take longer than 3 years for the stars to align and for Apple to use Nvidia again.

No one knows for sure, but it's an educated guess from what I read and see daily.

Vega is supposedly going to be more powerful than what Nvidia produces now.
But of course this is PR talk; we will know for sure in the next few weeks.
As for the drivers, I have no clue if these cards will be supported, but if Apple is even remotely serious about the pro market (probably not) they will try to support them actively. I don't want to get your hopes up, because Apple in general is not too strong on graphics card support.


Phobos -- I like how you write and think buddy. Thanks for the input. You're extremely knowledgeable.

I'm disappointed to say it, but I was thinking three years as well.

What, however, is Vega?
 
Vega is AMD's new line of graphics cards. It's not out yet, but hopefully we will see them in Apple's new Macs.
 
It's hard to say if Tim Cook will acknowledge AMD as their Mac Pro provider. Looking at how a number of articles stated that he got 8 million dollars as last year's income due to his incompetence... he's probably putting all his hope in iPhones and not Mac Pros and the rest.
 
WONDERFUL info guys.

I have access to 980Ti, 980, 680 -- and would be willing to buy some cards based on recommendations of you guys here so that I can provide quick demonstrations for people.

I did read a thread explaining that the 7950 corresponds to one of the R series cards, and so on. Learning how the nomenclature lines up will take me a little time, but I'll get there.

Does anyone here have recommendations for how to demonstrate the trade-offs of the cards? I.e., show the lackluster performance of a Titan X in FCP X vs. an optimal R9 setup, and so on?

Some comparisons are here FCPX: AMD vs NVIDIA: https://forums.macrumors.com/threads/fcpx-amd-vs-nvidia.1956128/
 
I'll have to have a look for the Adobe blog post about it; CUDA is not dead, but it is being phased out at Adobe.
There's an Adobe blog article somewhere listing what is accelerated by the GPU (not much) and which things need CUDA/OpenCL (OpenCL now does most, if not all, of what CUDA did).
There are some plugins that do need CUDA, but if you had them you'd know already.

AE has replaced a lot of its CUDA-accelerated parts with Cinema 4D in CC.

CUDA only did a few specialist parts of the work.
 
Be careful: that Maxwell card may show glitches in Adobe apps.
Not familiar with DaVinci, but are you sure it works with CUDA?

Never seen glitches in Adobe apps w/ Maxwell Titan-X cards w/ the back heatspreader option.
Yes, I agree putting half of the GDDR memory on the backside with ZERO cooling was a dumb idea.
The only time I have seen nVidia glitches is with cards overheating and dying.

AMD cards, that's a different story.
I've seen many glitches, although replacing the thermal paste (crud) on AMD cards w/ Arctic Silver works wonders.

Yes, DaVinci Resolve uses and prefers CUDA.

Investing in FCP-X workflows is playing with fire.
Its use is tied to macOS and Apple isn't interested in making workstations anymore.

Sure the new 2017 (i)Mac maxed out will be pretty sweet (and costly), but after 2017...
...what can one buy from Apple to run it? an A12X based Mac?
 
I'm a PP user, pretty sure I will abandon OSX once my Mac Pro 5,1 runs out of juice.

Windows here we come!
 