There has been much discussion about the drivers being a key factor in determining if these are truly workstation cards. I would venture a guess that AMD did NOT just give away the FirePro name (and brand) to Apple. I suspect in addition to just licensing fees there was some driver development written into the contract. We will probably never know, but AMD could easily have responsibility for the Windows drivers and they could also be working hand-in-hand with Apple on the OS X drivers.

I hope it turns out to be more collaborative because it would be a win-win for both parties. I also really want to see Mantle adopted on OS X, because I think it would open up some real opportunities for the platform.
 
The key substantive differentiators in the Windows world between consumer and workstation cards are a) whether the drivers are tuned for gaming performance or for performance with professional applications, and b) whether professional application vendors qualify and officially support their apps on particular models of card.

Well, it seems almost certain that the OS X drivers that Apple and AMD likely co-developed for the GPUs in these machines are indeed tuned for pro apps, not games, and pro app developers on the Mac platform will certainly qualify these cards for use with their apps and officially support their apps on them. So, you are indeed getting the benefits that some customers are willing to pay lots of money for in the Windows world. (And then some; because hardware configurations are so much more limited with the new Mac Pro than in the Wintel world, app vendors themselves will likely focus more effort on optimizing their apps for these GPUs on the Mac than they generally would for any specific model of GPU in the Windows world.)

Of course, on the Windows side, there's vigorous debate over whether these benefits are really worth the price of admission, and the answer to that question varies from user to user and app to app. Apple, however, has made this whole issue go away on the Mac with an interesting compromise: you don't have the option of cheap consumer cards anymore, but you get the most substantive benefits of workstation cards at a much lower price now.
 
Folks, here's the deal.

FirePros are custom versions of consumer GPUs. Or, consumer GPUs are custom versions of FirePros. What differs is the memory controllers (memory bandwidth), clocks, ECC VRAM (sometimes), and possibly other small hardware changes. The FirePros are also likely specially binned parts, meaning they test at the stable, high end of the spectrum coming off the production line. The markup you are paying is for the custom drivers, which target a particular market and are much more rigorously tested and written (one hopes).

The D300, D500 and D700 aren't exactly FirePro silicon; they are yet another derivative, but surely they trace their lineage through the FirePro line. What about the software? Well, it's Apple software, obviously, and Apple most often provides software for free. So from the beginning I've been saying it makes no sense to call the nMP D line the same as the FirePro, and they certainly won't cost the same. It would be ridiculous for Apple to charge $3,400 for each D700, and indeed they don't.

However, the D700/W9000 is based on the 7970 6GB. How much do those cost? About $850 for the 6GB versions, or $1,700 a pair. AFAIK the D700 pair is costing us about $2,000 total, and they are workstation class (hopefully with ECC VRAM).

So what do we have? All together, as a package, yes, the nMP is a deal compared to HP and Dell workstations. What you give up is upgradeable GPUs, internal cards and internal drives.

That's probably the best description so far of what workstation graphics cards are actually about. They might be based on the desktop versions, but the drivers and hardware are changed enough to make them better at what workstations are designed to do, along with better stability and better support.
 

Excluding the dual-socket CPU, the Z820 is nowhere near the specs of the top-end Mac Pro. My point was that if you add up the hardware and buy them as a consumer, you can't actually make your own for cheaper.

As you can see, the only real debate is the consumer/workstation GPU. But then there's been exactly the same debate for years now. The truth is that workstation GPUs are built for stability, and although they often share near identical components, there is a difference.

If Apple's D700 is the FirePro W9000 (and I'm betting it is), the Mac Pro is an excellent deal.
 
First of all, your prices are too high. Looking at some UK retailers, I see the Xeon 2697 v2 available for about £1,800. This is considerably cheaper than the £3,000 Apple is effectively charging for it (the £2,800 upgrade, plus the £200+ you're already paying for the quad-core it ships with). The prices Apple charge for the CPU upgrades are outrageous.

And if you're building your own system, you can simply buy two 6-core processors for a considerably lower price, and get higher performance, as those processors are clocked higher. You're not limited to the single socket Apple has restricted the new Mac Pro design to.


You're right that few places seem to be offering 16GB DIMMs - they seemed to have disappeared a year or so ago. Apple has never been competitive on memory pricing, so I'm sure that will change very quickly.


It's true that Apple's flash prices are very competitive - but if that's the best deal, there's nothing to stop you buying one of their SSDs and putting it in your PC.


It's looking very likely that the D700 cards are not FirePro W9000 cards, but rebranded 7970 cards instead.
I see no mention of ECC memory on the Apple website, which is the main reason the FirePro cards cost so much.


Cases, motherboards and power supplies are not expensive components; they barely factor into it.
 
I've just noticed this on the Scan site - it looks a bit like a Windows version of the Apple D700 - AMD FirePro S9000 Server Graphics Card - 6GB.

http://www.scan.co.uk/products/6gb-...-30-(x16)-384-bit-gddr5-displayport-passive-c

The price is much less than the W9000, at £1,710.86 inc. VAT.
And it's still using ECC memory. Until we know one way or the other, given the prices Apple are charging for the D700, it should be assumed that the D700s do not have ECC - especially when Apple are marking up the CPU prices so much.
 
This is the incorrect bit. However, the new Mac Pro is pretty good value in comparison to other similarly specced workstations.

This is pretty much always true when a new Mac Pro is released; it's only when that particular model is reaching end of life that Windows PC workstations start looking much cheaper, because of their more rapid product cycles.
 
I think the real question is: who needs dual D500s or D700s?

If we were to build the same system elsewhere, we'd probably only include a single consumer GPU instead, like a GTX 780 Ti or, if you want that 6GB of VRAM, a GTX Titan. Both are leagues cheaper than the W9000.

Now, there is something you all need to realise, which some other posters have already touched on, so I don't mean to talk down to you or anything like that.

The HD 7970 is the same as the D700. It's the same GPU die. What's different? Apple put more RAM on it and put it on a custom printed circuit board.

But what about software? Well, the reason people don't buy pro cards for OS X is that Apple supplies the graphics drivers. When you buy a FireGL for a Windows system you use AMD's drivers, which unlock better OpenGL performance in professional software like Maya and so on.

But in OS X that is not the case. Apple supplies the drivers.

So if you were going to build a Hackintosh, what GPUs are you going to use? You'd have to be a complete idiot to get W9000s, because the HD 7970 is not only the same GPU, it uses the same drivers that the D700s included with the Mac Pro do. What does this mean? It means you get the same performance without the price premium.

They have obviously done a deal with AMD to use HD 7970 GPUs with pro (or semi-pro) level drivers.

Can you build a system cheaper than the Mac Pro? Er, yes - and if you want the pro-level graphics drivers, just put Mavericks on the system you build with the HD 7970s.
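
If anyone wants to sanity-check that claim on a real machine, here's a minimal sketch (assuming OS X with its stock Python; any specific kext names you see, such as AMDRadeonX4000, depend on the OS release and are only illustrative) that lists which AMD graphics kexts are actually loaded - i.e. which driver stack you're really running:

```python
# Minimal sketch: list the loaded AMD/ATI graphics kexts on OS X, which is
# what actually determines the driver stack in use. The specific bundle names
# (e.g. AMDRadeonX4000) vary by OS release and are illustrative only.
import subprocess

loaded = subprocess.check_output(["kextstat"]).decode("utf-8", "replace")
for line in loaded.splitlines():
    if "AMD" in line or "ATI" in line:
        print(line)
```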
 
I agree - I think the D700 is a hybrid, somewhere between a 7970 6GB version and a W9000.

When I look at some of the spec charts this almost looks like a 7960 :)

The fact that you can trick System Profiler into calling cards Dxxx this or that doesn't hold much weight with me. It's just reporting whatever some kext file is telling it, not evaluating the physical hardware.
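
To make that concrete, here's a rough sketch of pulling both the marketing name (the string the kext supplies) and the PCI device ID (which identifies the actual silicon) on OS X; the field labels are as System Profiler prints them and may vary between releases:

```python
# Rough sketch: the "Chipset Model" string is just what a kext plist reports,
# while the "Device ID" is the PCI identifier of the actual GPU silicon.
import subprocess

info = subprocess.check_output(
    ["system_profiler", "SPDisplaysDataType"]).decode("utf-8", "replace")
for line in info.splitlines():
    field = line.strip()
    if field.startswith("Chipset Model:") or field.startswith("Device ID:"):
        print(field)
```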

Frankly, if this is a W9000, I would not want to pay full price for it, because it has likely been down-clocked in multiple ways to fit within the nMP's power and thermal envelope.

We have to see performance reviews to see where the D700 lands relative to a 7970 and a W9000, to really see what its value is.

Also, by my math the D700s cost about $1,400 for the pair: the D700 is a $1,000 add over the D300, which I'm guessing is roughly a $200 card, x2 of them.
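
For what it's worth, the arithmetic behind that estimate, using only the numbers above (the $200-per-D300 figure is a guess, not an Apple price):

```python
# Back-of-the-envelope estimate using the poster's own numbers.
d300_guess_each = 200      # assumed street value of one D300-class card (a guess)
d700_bto_upcharge = 1000   # Apple's upgrade price from dual D300 to dual D700

d700_pair_estimate = 2 * d300_guess_each + d700_bto_upcharge
print(d700_pair_estimate)        # 1400 for the pair
print(d700_pair_estimate / 2.0)  # ~700 per card
```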
 

I don't think you're understanding. The only point of buying a pro card is the drivers. And the drivers work perfectly on the HD 7970, meaning you get all the extra pro features on a consumer, non-pro card. It is not a simple trick in System Profiler (that is a visual change only, with no benefit); this is literally running the same graphics driver on the consumer card and gaining all the benefits.
 

Ya, benchmarking will answer all this. It would be interesting to see a MP 5,1 with a 7970 vs. a nMP with a D700.

Also, do you see why I made the 7960 comment? On the first page, we see that two 7970s would equal 7.6 TFLOPS, not 7 TFLOPS like the two D700s, based on Apple's claims.

Also, you see that two 7970 6GB editions would cost you $1,700, so my $1,400 estimate seems in line with the D700s being two down-clocked 7970s with pro drivers - which, BTW, completely change the rendering behavior of the card compared to the gaming drivers.
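
To spell out where the 7.6 vs 7.0 TFLOPS gap comes from: single-precision throughput for these GCN parts is shaders x 2 FLOPs per clock x clock. The D700 clock below is back-solved from Apple's 3.5 TFLOPS claim rather than taken from an official spec sheet:

```python
# Single-precision throughput: shaders x 2 FLOPs/clock (FMA) x clock.
# The D700 clock is back-solved from Apple's 3.5 TFLOPS claim, not official.
def sp_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0  # GFLOPS -> TFLOPS

hd7970 = sp_tflops(2048, 0.925)  # ~3.79 TFLOPS at the 925 MHz base clock
d700 = sp_tflops(2048, 0.850)    # ~3.48 TFLOPS, matching Apple's ~3.5 figure

print(round(2 * hd7970, 1))  # ~7.6 for a pair of 7970s
print(round(2 * d700, 1))    # ~7.0 for the pair of D700s
```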
 

Yeah I did. I think Apple probably underclocked it a little to get better thermals. The HD 7970 is kind of a wishy-washy card in general, having been released twice by AMD; the second version, the 1GHz edition, is the more recent one. The GPUs are the same, but AMD tends to mess with the frequency when they don't have new silicon ready to go against NVIDIA's releases, which is what occurred earlier in the year.

The new Mac Pro sure is one thing: interesting. It's like a mystery to solve.
 
I don't think you're understanding. The only point of buying a pro card is the drivers. And the drivers work perfectly on the HD 7970, meaning you get all the extra pro features on a consumer, non-pro card. It is not a simple trick in System Profiler (that is a visual change only, with no benefit); this is literally running the same graphics driver on the consumer card and gaining all the benefits.
No - the main reason to buy a pro card is for ECC memory. The drivers are of secondary importance. It looks like the D700 does not have ECC memory - meaning it's not really a pro card - which would explain the price.
On the Windows side of things, you also have access to 10-bit output in pro applications, but I'm fairly certain that's really a software restriction too.

With Nvidia things are a little bit different, and they have cards with 12GB RAM on board which is twice what the consumer cards offer.
 
I don't think you're understanding. The only point of buying a pro card is the drivers. And the drivers work perfectly on the HD 7970, meaning you get all the extra pro features on a consumer, non-pro card. It is not a simple trick in System Profiler (that is a visual change only, with no benefit); this is literally running the same graphics driver on the consumer card and gaining all the benefits.

You can also frequently flash retail gaming cards to identify as workstation cards and therefore take advantage of workstation drivers under Windows. But the problem is, you're not getting "all the benefits" this way, because the other big benefit of running on a pro GPU is official support, both from the GPU vendor and from your pro app vendors. You're not going to get that with a flashed card, and you're sure as hell not going to get it with a Hackintosh.

No - the main reason to buy a pro card is for ECC memory. The drivers are of secondary importance. It looks like the D700 does not have ECC memory - meaning it's not really a pro card - which would explain the price.

The ECC scheme in the W9000 is described in various places as 'virtual', so I'm not sure it's adding any real hard costs there. Additionally, ECC is primarily of interest to e.g. engineers running simulations on the GPU, where you need confidence your results are exactly correct down to however many decimal places. It's typically of little value to creative pros, a far more relevant market for this system. Most creative pro workflows operate on the basis of 'if it looks right, it is right' and nobody loses sleep worrying whether some subtle inaccuracy in the result might have been introduced by a cosmic ray flipping a bit somewhere.
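
As a toy illustration of that difference (nothing GPU-specific, just IEEE-754): one flipped bit in a double can shift a value by hundreds of orders of magnitude, while a low-order flip in an 8-bit pixel value is visually negligible:

```python
# Toy illustration: a single bit flip in a double vs. in an 8-bit pixel value.
import struct

def flip_bit(x, bit):
    """Flip one bit in the IEEE-754 representation of a Python float."""
    (as_int,) = struct.unpack("<Q", struct.pack("<d", x))
    (flipped,) = struct.unpack("<d", struct.pack("<Q", as_int ^ (1 << bit)))
    return flipped

print(flip_bit(1.0, 61))  # exponent bit flipped: 1.0 becomes ~7.5e-155
print(255 ^ (1 << 2))     # a low bit flipped in a pixel: 255 becomes 251
```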
 
You can also frequently flash retail gaming cards to identify as workstation cards and therefore take advantage of workstation drivers under Windows. But the problem is, you're not getting "all the benefits" this way, because the other big benefit of running on a pro GPU is official support, both from the GPU vendor and from your pro app vendors. You're not going to get that with a flashed card, and you're sure as hell not going to get it with a Hackintosh.

Obviously you're not going to get support, but when was the last time you called up the maker of some pro software you use to get support? For me that's never, and I am a creative professional. Maybe I'm just lucky, but things "just work" for me and always have done - even when I was flashing NVIDIA 6800s into Quadros almost a decade ago.
 
I think people are missing the point here. There is no reason to build the same spec for the same money. You should build the better spec with the same money.

There's absolutely no reason to get an AMD GPU over NVIDIA and a potential Maximus setup for real GPU computing.

It would be kind of fun to see how much an Extreme Ivy Bridge-E would cost, paired with a Quadro K5000 + Tesla, 32GB of RAM and a 500GB 840 EVO, stacked up on a high-end LGA2011 Supermicro motherboard, a Seasonic PSU, some Noctua cooling, and all of that put inside a Supermicro chassis.
 

Isn't it more fun to see how much compute power you can get for your $ ? :)

There certainly is a reason to go with AMD; they're not artificially limiting their GPGPU performance through drivers and/or lasering off half the silicon. You don't have much choice with a CUDA workload or a certified system, but seriously, did I read only one post in this thread mentioning vendor support as the key difference between consumer and workstation-brand cards? Have the people that dismiss this and claim 'stability' and GPGPU performance differentiate consumer/workstation models ever worked on large multi-GPU systems/code?

Look at all the number-crunching homebrew systems out there: the 7950 and 7970/280X absolutely destroy everything else on the market in power/cost comparisons. If seven cards attached with riser cables to a single bargain-bin board slung randomly onto a desk can run at full load 24/7 for months on end, how is stability an argument? Also, I don't think a 14-year-old FPS gamer should expect (logically or legally) to have a GPU that's somehow less stable because he's not running the workstation-branded equivalent.

While threads like the OP's aren't really incorrect in claiming that an off-the-shelf, part-for-part comparison against a big-vendor system looks like sensible value, on an enthusiasts' technology forum like this it seems very silly to me :)
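
If anyone actually wants to run that kind of compute-per-dollar comparison, a rough starting point is just enumerating the OpenCL devices in a box and noting their raw resources; this sketch assumes the third-party pyopencl package is installed, and prices would still have to be supplied by hand:

```python
# Rough starting point for a compute-per-dollar comparison: list each OpenCL
# device's raw resources. Assumes the third-party pyopencl package.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        print(dev.name)
        print("  compute units  :", dev.max_compute_units)
        print("  max clock (MHz):", dev.max_clock_frequency)
        print("  global mem (MB):", dev.global_mem_size // (1024 * 1024))
```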
 
This is just wrong.

The video cards are consumer cards rebadged as "FirePros."

The Radeon 7950 shows up as a D700 in OSX. Go look up how much those cost.

The difference between the consumer cards and the workstation cards is basically the driver. And AMD does not do the drivers for the Mac Pro. You are getting consumer cards that Apple is marketing as "workstation" because they happen to be the same silicon that AMD uses, yet you are not getting AMD's drivers.

It's a MASSIVELY profitable machine, and incredibly expensive. I don't believe the second GPU can even be used for graphics.

Other than the VRAM. Oops.
 
Isn't it more fun to see how much compute power you can get for your $ ? :)

There certainly is a reason to go with AMD; they're not artificially limiting their GPGPU performance through drivers and/or lasering off half the silicon. You don't have much choice with a CUDA workload or a certified system, but seriously, did I read only one post in this thread mentioning vendor support as the key difference between consumer and workstation-brand cards? Have the people that dismiss this and claim 'stability' and GPGPU performance differentiate consumer/workstation models ever worked on large multi-GPU systems/code?

Look at all the number-crunching homebrew systems out there: the 7950 and 7970/280X absolutely destroy everything else on the market in power/cost comparisons. If seven cards attached with riser cables to a single bargain-bin board slung randomly onto a desk can run at full load 24/7 for months on end, how is stability an argument? Also, I don't think a 14-year-old FPS gamer should expect (logically or legally) to have a GPU that's somehow less stable because he's not running the workstation-branded equivalent.

While threads like the OP's aren't really incorrect in claiming that an off-the-shelf, part-for-part comparison against a big-vendor system looks like sensible value, on an enthusiasts' technology forum like this it seems very silly to me :)

Nothing to do with the OT or your reply, but you've got me curious now: what kind of GPU pipeline do you have at your studio or workplace, and what work do you do on it? The geekiness in me would love to know.
 

I don't have a single workplace, nor does it *really* involve any computers :)
We do the cooling systems for Cray's UK installations/moves (and the odd IBM/NEC when their engineers get it wrong :)) along with a fair few custom setups. Considering the average customer (even the weather systems are EU government/MoD), specifying the hardware would probably have me taken away by black-suited men in the middle of the night, but GPU systems from vendors tend to be solely NVIDIA.
What's really interesting is talking with the engineers about how supercomputing has changed over recent years (to their detriment). Ignoring previous investment (mainly in code), support and the scaling allowed by their interconnects, my previous example of a cheap motherboard with a load of consumer GPUs slung on a table isn't really far off what you can buy from a supercomputing firm if you've got a budget in the tens of millions.
 