
MacVidCards
Original poster
Thanks to a post by 666sheep who noticed this.

I dug out a 7870 that I had put together with an EFI ROM, courtesy of Netkas' help.

As 666sheep had noted, it got ID'd as a FirePro D500.

So it's likely that the nMP card will have the same core and a device ID very near this one.

Everyone wants to believe these workstation GPUs get special core chips made with Unicorn Hair. Sadly, the Unicorn flew off, and now they are all made of the same silicon, on the same lines, and with the same design as retail gamer chips. A few features are sometimes laser-snipped off, and sometimes disabled just by a hard-coded device ID.

I emphasize again: WORKSTATION GPU CHIPS ARE THE SAME DESIGN AS RETAIL GAMER CHIPS. The differences are created via laser cuts and device IDs, which are changed via tiny 10K or 40K resistors in various positions. I'm not saying there is anything wrong with this; it's how they pay for R&D on a chip once and sell different variants to different markets. Just don't believe that your $300 7970 is worth $3,300 because a resistor was placed differently. (Wink to Cupertino)

Especially on a Mac, where nobody is going to bother writing two different drivers.

Anyhow, the specs don't match perfectly, but there's no chance this isn't a D500 core.

The D300 is also mentioned in the drivers. Notice that the D700 is right above the Radeon 7950, just as in real life.

So maybe there is a corresponding list of device IDs in the driver.

But anyone with Mavericks installed can have a look.

S/L/E/AMDRadeonX4000GLDriver.bundle

then open it up and look inside at AMDRadeonX4000GLDriver:

Radeon HD Tahiti XT Prototype
Radeon HD - FirePro D700
Radeon HD 7950
Radeon HD - FirePro D500
Radeon HD Aruba XT Prototype
Radeon HD Aruba PRO Prototype
Radeon HD Tahiti Unknown Prototype
Radeon HD Pitcairn XT Prototype
Radeon HD Pitcairn PRO Prototype
Radeon HD - FirePro D300
Radeon HD Wimbledon XT Prototype
Radeon HD Neptune XT Prototype
Radeon HD Pitcairn Unknown Prototype
Radeon HD Verde XT Prototype


So, same kext runs them all.
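
(If you'd rather pull that list out with a script than poke around in a hex editor, here is a rough Python sketch of the classic Unix `strings` approach. The bundle path is the one mentioned above; the Contents/MacOS layout inside the bundle is my assumption of the standard layout, so adjust if yours differs.)

```python
# Rough equivalent of the Unix `strings` tool: dump printable ASCII runs
# from the GL driver binary, keeping only the GPU marketing names.
import re

# Path as discussed above (S/L/E = /System/Library/Extensions); the
# Contents/MacOS subpath is the usual bundle layout -- adjust if needed.
PATH = ("/System/Library/Extensions/AMDRadeonX4000GLDriver.bundle/"
        "Contents/MacOS/AMDRadeonX4000GLDriver")

with open(PATH, "rb") as f:
    data = f.read()

# Printable ASCII runs of 8+ characters, to skip random byte noise.
for m in re.finditer(rb"[\x20-\x7e]{8,}", data):
    s = m.group().decode("ascii")
    if "Radeon" in s or "FirePro" in s:
        print(s)
```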

Here is the kext as it was in 10.8.5:

Radeon HD Tahiti XT Prototype
Radeon HD 7950
Radeon HD Tahiti PRO Prototype
Radeon HD Aruba XT Prototype
Radeon HD Aruba PRO Prototype
Radeon HD Tahiti Unknown Prototype
Radeon HD Pitcairn XT Prototype
Radeon HD Pitcairn PRO Prototype
Radeon HD Wimbledon XT Prototype
Radeon HD Neptune XT Prototype
Radeon HD Pitcairn Unknown Prototype
Radeon HD Verde XT Prototype

The 7870 doesn't match the listed specs exactly; it looks like they downclocked the RAM a bit while adding some. The other numbers aren't an exact match either, but this is the core.

I can also tell you that the 7770 ISN'T the D300. And FWIW, it isn't the 6950 either. Anyone have a 6970 to check? Maybe a 7750. There are several other cards to test; one will be close enough to show up as a D300.
 

Attachments

  • Screen Shot 2013-10-23 at 5.35.33 PM.png (116.1 KB)
  • Screen Shot 2013-10-23 at 5.34.09 PM.png (1 MB)
  • Screen Shot 2013-10-23 at 5.16.18 PM.png (140.7 KB)
  • 7870.gif (21 KB)
Especially on a Mac, where nobody is going to bother writing two different drivers.

Can't emphasise this enough (well, there are some more font size options, I guess). It wasn't until the Quadro 4000 that there were any optimisations made for OS X, and they don't even compare to those found in Windows.

FirePro is just branding as far as nearly every OS X user is going to be concerned. Great on Windows for those who benefit from the better drivers, support, and certification amongst a wide world of manufacturers. Apple control all that anyway when it comes to Macs, so it doesn't matter if these are FirePro or Radeon, except they can now charge you more.

Yeah, I know they have ECC too, but that benefits niches within niches.
 
Don't forget that FirePros use ECC memory whereas the gamer cards don't. So they are not completely the same; then again, no ECC memory costs $3,000.
 
You've alluded to this, but the D500 has a 384-bit memory bus vs. the 256-bit bus on the 7870. Plus, the 7870 only has 1280 stream processors vs. the D500's 1526 (1536?) stream processors. I think the D500 is actually more closely related to the 7870XT.
 
You've alluded to this, but the D500 has a 384-bit memory bus vs. the 256-bit bus on the 7870.

True, but it's not a big problem for AMD to add (back) the third memory controller to the GPU, especially when it's designed to run three of them (Tahiti Pro and XT). The LE just has one removed, hence the 256-bit bus vs. 384-bit on the Pro and XT. The LE is called "Tahiti Pro Prototype" not without purpose.

Plus, the 7870 only has 1280 stream processors vs. the D500's 1526 (1536?) stream processors. I think the D500 is actually more closely related to the 7870XT.

If you look closer at MVC's pics, you'll see that he means the 7870 XT -> that card is in his screenshots, not the Pitcairn-based 7870.
 
We need someone with one to try and see.

Or someone good with IDA might be able to make sense of the file I mentioned. I imagine that somewhere else there are pointers to those ASCII text locations.
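
(Short of firing up IDA, a quicker sanity check is the matching dictionary in the accelerator kext next door. A minimal sketch, assuming the kext is named AMDRadeonX4000.kext and that its personalities use the standard IOPCIMatch key for PCI device IDs; both are assumptions, so check your own S/L/E.)

```python
# Hedged sketch: print the PCI device IDs each IOKit personality in the
# accelerator kext matches on. The kext name is an assumption.
import plistlib

PLIST = ("/System/Library/Extensions/AMDRadeonX4000.kext/"
         "Contents/Info.plist")

with open(PLIST, "rb") as f:
    info = plistlib.load(f)

for name, personality in info.get("IOKitPersonalities", {}).items():
    ids = personality.get("IOPCIMatch")
    if ids:
        # IOPCIMatch is a space-separated list of hex
        # device/vendor ID pairs.
        print(name, "->", ids)
```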
 
FirePro is just branding as far as nearly every OS X user is going to be concerned. Great on Windows for those who benefit from the better drivers, support, and certification amongst a wide world of manufacturers. Apple control all that anyway when it comes to Macs, so it doesn't matter if these are FirePro or Radeon, except they can now charge you more.

Macs come with Boot Camp. So is Apple not going to have their FirePro cards also be differentiated in Windows?

There are some apps that are only certified against "Pro cards" and that have no OS X versions. While they probably won't have a specific Mac Pro 2013 + FirePro certification if the cert requires whole-system evaluation, Mac Pro users are probably going to get "Pro drivers" with the system they buy.

Yes, it helps Apple look skinny when standing next to the "fat lady" of Pro graphics card profit markups, but it's not totally detached from the Windows-oriented benefits either.

----------

IIRC, the first batch of the 7870, not the GHz Edition, was based on Tahiti LE.

" ... AMD's next-generation R9 270X will reportedly feature a similar "Tahiti LE" to the Radeon HD 7950 and HD 7870 limited edition. ... "
http://www.tomshardware.com/news/r9-270x-radeon-tahiti-le,24460.html

The Pitcairn XT 7870 is close to the D300 in specs, and the Tahiti LE 7870 is close to the D500.

http://en.wikipedia.org/wiki/Radeon_HD_7000_Series#Southern_Islands_.28HD_7xxx.29_Series

So it is going to depend upon which one of those two you are going to plug in. The W7000 is a variant of Pitcairn. It wouldn't be surprising if this Tahiti LE variant were a tweaked version of the Tahiti XT Apple is using in the D700.
 
Yup, it looks like the D500 fits between the 7870 XT (1536 cores) and the 7950 (384-bit bus & 3GB GDDR5). Honestly that's better than I expected, and there's two of 'em.
 
Seems like more of an opportunity to siphon more dollars out of enterprise customers than anything. CIO tells upper management they need X amount of dollars to spend on "high end" workstations to get work done; AP department rolls their eyes and authorizes the big checks.
 
Everyone wants to believe these workstation GPUs get special core chips made with Unicorn Hair. Sadly, the Unicorn flew off, and now they are all made of the same silicon, on the same lines, and with the same design as retail gamer chips. A few features are sometimes laser-snipped off, and sometimes disabled just by a hard-coded device ID.

So I've been googling my little heart out about this issue and I've found 2 points of view: Totally uninformed opinions about "oh, workstation cards are optimized" (it's got electrolytes!), and others saying "it's the same thing, just 10x the price."

Is there any authoritative article on the subject which can definitively show that apart from the price, the driver discrimination, and the ECC, there's no substantive difference between workstation and retail cards?
 
Is there any authoritative article on the subject which can definitively show that apart from the price, the driver discrimination, and the ECC, there's no substantive difference between workstation and retail cards?

AMD Senior Marketing Manager Alexis Mather describes the actual differences in this interview.

It appears most of the cost difference is due to AMD having people work closely with ISVs and providing 24/7 phone support.

The interview is from 2009, so it might not apply to current generation cards and engineering practices.
 
So I've been googling my little heart out about this issue and I've found 2 points of view: Totally uninformed opinions about "oh, workstation cards are optimized" (it's got electrolytes!), and others saying "it's the same thing, just 10x the price."

Is there any authoritative article on the subject which can definitively show that apart from the price, the driver discrimination, and the ECC, there's no substantive difference between workstation and retail cards?

When it comes to workstation cards:
You are paying for the drivers (R&D) and testing. The rendering behavior is totally different from (the opposite of) how a gaming card renders the screen: it is catered to rendering DCC and CAD apps and to working gracefully with large data sets. A gaming card is much less efficient with its rendering style (drivers).
You are paying for the certification: testing that the card is compatible with a myriad of 3D and CAD apps. Look at any mid-to-high-end software and notice there is a list of recommended and certified graphics cards. This is the result of testing the cards with every new version of the software and making sure the drivers perform well, which likely involves back and forth between the software vendor and AMD.
You are paying for more VRAM. Up until recently you'd only see more than 2GB of memory on workstation cards. Right now you only see 6GB on the Titan and W9000, but 8GB and 12GB are coming.
You are paying for ECC VRAM. This eliminates artifacts and floating point errors. CAD apps definitely want this where every millimeter matters or the application may be open for days straight.
You are paying for technical support, as Umbongo added.
 
....
It appears most of the cost difference is due to AMD having people work closely with ISVs and providing 24/7 phone support.

Therein lies the rub with Apple trying to wrap themselves in the "Pro card" flag. One, Apple is quite known for taking a minimalistic approach to allocating people to projects. Are a significantly large number of ISVs getting deep introspection and high-touch collaboration on their graphics pipelines from Apple? Two, is there going to be a fast-track support queue of knowledgeable people for these verticals? (If they don't do the first, it is going to be highly problematic to effectively do the second.) This 24/7 support has to go beyond just having friendly people who will read from a scripted conversation dialog. (I know of no one who wants to effectively pay $100-200/year for support that consists of "Bob" from India, or anywhere else, mechanically reading from a fixed script. Warm bodies online 24/7 doesn't cut it at those kinds of prices.) Warranty? Chopped down to Apple's minimalistic 1 year. Longer parts replacement availability? Nope.

It will be an odd-duck value proposition when the better support queue, broader certifications, more highly customized drivers, etc. only appear once you've flipped your Mac Pro into Windows mode. Apple can charge Windows pro card prices if they fill in the same set of supplementary value points. Stamping "Designed by Apple in California" on the hardware isn't one of those.


The additional people and certification costs are nowhere near the level of the markups. It is a path to charging more and higher margins.
 
When it comes to workstation cards:
....
You are paying for more VRAM. Up until recently you'd only see more than 2GB of memory on workstation cards. Right now you only see 6GB on the Titan and W9000, but 8GB and 12GB are coming.

The problem for these Mac Pro cards is that Apple gutted the VRAM:

D300 (2GB) relative to the W7000 (4GB): down 50%
D500 (3GB) relative to the W8000 (4GB): down 25%

There is already a huge percentage price increase over the mainstream cards to help pay for this VRAM increase. Stripping the VRAM here is far more plainly to line Apple's pockets than to deliver additional value to the customer.


You are paying for ECC VRAM. This eliminates artifacts and floating point errors.

It's closer to just paying for more VRAM to hold the ECC parity data (over-provisioned past the rated user-usable memory). Coupled with the above, where Apple cranks back the user-usable memory, this ends up being a saved cost here also.
 
You are paying for ECC VRAM. This eliminates artifacts and floating point errors. CAD apps definitely want this where every millimeter matters or the application may be open for days straight.

… Totally uninformed opinions about "oh, workstation cards are optimized" (it's got electrolytes!), and others saying …

Now, that's priceless in one thread. :)

Floating point errors occur due to the limited precision of every CPU or GPU. Using ECC to avoid floating point errors is like … drinking whisky to avoid the common cold.
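
(A quick illustration of the kind of "error" ECC can't touch: ordinary rounding from finite binary precision, present on every CPU and GPU.)

```python
# Rounding from finite binary precision, not a memory bit flip.
# ECC VRAM can do nothing about this class of error.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004
print(a == 0.3)  # False: a precision artifact, not corrupted memory
```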

CAD applications store their actual model geometry differently from how they display it. Even if the visualization is one pixel off the correct position, your 3D model is still pretty exact.
And since your CNC manufacturing data is based on the 3D model and not on the display output, a non-pro GPU will never be the cause of missing the mark by one millimeter.

By the way, does anybody check whether your CNC milling machine uses ECC in its internal storage? Why bother with ECC if you use it only during a small part of the way your data moves from idea to final product?
 
The problem for these Mac Pro cards is that Apple gutted the VRAM:

D300 (2GB) relative to the W7000 (4GB): down 50%
D500 (3GB) relative to the W8000 (4GB): down 25%

There is already a huge percentage price increase over the mainstream cards to help pay for this VRAM increase. Stripping the VRAM here is far more plainly to line Apple's pockets than to deliver additional value to the customer.

Sure, but you are getting two of them. So in theory, if the VRAM is seen as one combined pool, with the two D300s you are getting 4GB, 1:1 with more processing power; with the two D500s, 6GB = 150%; and 12GB with the D700s.

At the end of the day, this is what Apple is offering. If you don't like it, your options are a 2010/2012 MP, iMac, Hackintosh, or PC. It's clear that the majority of people have no clue why workstation cards exist, what the different drivers actually do (it's not like the governor on a golf cart; it is a totally different rendering process), and have no use for a workstation card, having never used one before. It's a world of improvement for some in their applications, and for others a slightly inferior experience in WoW that could have been had for hundreds of dollars less.
 
You know, the irony may be that they will sell like hotcakes.....to Windows folks looking to skimp on FirePro costs. They will erase the PCIe drive, load Windows 8, and be assimilated.

I don't think anyone believes that Apple will literally charge $7,000 for the dual D700s in the top-end nMP. Even removing $600 for the trade-in of the D300s from the equation, the $6,400 markup would mean nobody would buy them.

Especially so if using OS X, where there aren't these "optimized drivers" that exist in Windows.

Could end up the ultimate Windows render machine.
 
You know, the irony may be that they will sell like hotcakes.....to Windows folks looking to skimp on FirePro costs. They will erase the PCIe drive, load Windows 8, and be assimilated.

I don't think anyone believes that Apple will literally charge $7,000 for the dual D700s in the top-end nMP. Even removing $600 for the trade-in of the D300s from the equation, the $6,400 markup would mean nobody would buy them.

Especially so if using OS X, where there aren't these "optimized drivers" that exist in Windows.

Could end up the ultimate Windows render machine.

I wanted to get the D700s, but the prices people are suggesting are making me think I will just go with the D500s, since I know the price of them with the hex.
 
Sure, but you are getting two of them. So in theory, if the VRAM is seen as one combined pool, with the two D300s you are getting 4GB, 1:1 with more processing power...

The VRAM isn't a combined pool (this notion that Apple has some permanent CrossFire mode on all the time with both cards doesn't have much substance to it). If computation is running on one GPU's cores, pointing at the off-card VRAM is about as far away as pointing at the RAM the CPU package is connected to. Might as well point at that too and say it's 16GB of RAM. It isn't.

That's why, in the real market for computation-focused cards, they come with REAL, not virtual, RAM.

The second card brings another set of cores that probably need data fed into them to show value. Two W7000s would have 8GB of VRAM if you want to play the "what if we virtually weaved these together" game.

This isn't an issue if Apple correspondingly dropped the price. If you get less and pay less, that is not a problem. But pay more and get less... that is a huge value proposition problem. At 2GB, this is exactly the same as what 7870 GHz Edition cards ship with. There is zero value differentiation. You can wave hands about two cards... but if both of the two have value problems, then you have doubled down on the problem. That isn't part of the solution; paying twice as much for 4GB is not a good thing.


At the end of the day, this is what Apple is offering. If you don't like it, your options are a 2010/2012 MP, iMac, Hackintosh, or PC.

There are always options to consider. If Apple's D300 is really just exactly the same as a mainstream 7870 card, they are going to find it extremely difficult to get folks to upgrade out of their 2009-2012 Mac Pros. Even more so for the more than a few folks with 2008s.

It's clear that the majority of people have no clue why workstation cards exist,

The problem for Apple is that they aren't trying to sell these to the majority of people. Folks who buy workstations tend to know about workstation cards. The distribution of knowledge inside the market is what matters.
 
You know, the irony may be that they will sell like hotcakes.....to Windows folks looking to skimp on FirePro costs. They will erase the PCIe drive, load Windows 8, and be assimilated.

I don't think anyone believes that Apple will literally charge $7,000 for the dual D700s in the top-end nMP. Even removing $600 for the trade-in of the D300s from the equation, the $6,400 markup would mean nobody would buy them.

Especially so if using OS X, where there aren't these "optimized drivers" that exist in Windows.

Could end up the ultimate Windows render machine.

Do you really think AMD would allow that? They'd be having their lunch eaten. I'm sure it'll be priced accordingly--4 grand maybe?

Or maybe the thing just won't be recognized by AMD's FirePro Windows drivers, a death knell for anyone who uses Boot Camp.
 
Or maybe the thing just won't be recognized by AMD's FirePro Windows drivers, a death knell for anyone who uses Boot Camp.

And that could take the concept of "proprietary" to a new level.

And the fans will defend it.....

And it's why I "just say No" to anyone who wants to buy an Apple to run Linux or Windows. (Yes, I block the Purchase Order.)
 