Seems like it; however, 666sheep posted screen grabs of Mavericks identifying the 7870XT as the D500, which seems pretty conclusive.

Yeah, as far as I know, all that says is that a 7870XT and D500 share a device ID... It doesn't imply anything about the actual capabilities of the two devices (which is evidenced by the fact that the two have different memory controllers).
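Incidentally, the "D500" label people are seeing is just the driver-reported name string, which any OpenCL app can read back. A minimal sketch, assuming OS X's OpenCL framework (compile with clang -framework OpenCL; not anything 666sheep actually ran, just an illustration):

Code:
#include <stdio.h>
#include <OpenCL/opencl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint count = 0;

    /* Grab the first platform (Apple's) and all GPU devices on it */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &count) != CL_SUCCESS)
        return 1;

    for (cl_uint i = 0; i < count; i++) {
        char name[256] = {0};
        /* CL_DEVICE_NAME is the driver-supplied marketing string, the
           same one Mavericks shows; it proves naming, not silicon */
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("GPU %u: %s\n", i, name);
    }
    return 0;
}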

Well this discussion makes my decision harder. Obviously I'd like to get away with a $3k base unit. OTOH, I wouldn't mind a machine that kicks butt in the few games I play too. Oh, and I'd like to start using OpenCL for some modeling software I'm writing ... I have no idea what to get. Should I get base, push up the CPU line, or push up the GPU line?

As I said, perhaps the added memory bandwidth opens up some extra performance for the D500. It also sounds like you may regret getting a pair of D300s, but wouldn't likely regret spending $500 more on the D500s... so... :)
 
Yeah, as far as I know, all that says is that a 7870XT and D500 share a device ID... It doesn't imply anything about the actual capabilities of the two devices (which is evidenced by the fact that the two have different memory controllers).

Fair point



As I said, perhaps the added memory bandwidth opens up some extra performance for the D500. It also sounds like you may regret getting a pair of D300s, but wouldn't likely regret spending $500 more on the D500s... so... :)

Right, probably so. And getting the $4k hex-core isn't that much further :)

The proof is in the pudding, and in Apple getting to do what they like best: carefully balancing system components. Any way you cut it, I'd have to believe the D300s aren't a step up all around.
 
Yeah, as far as I know, all that says is that a 7870XT and D500 share a device ID...

A device ID cannot be shared by two different devices because it's meant to be unique to each one. The first nMP teardowns will tell the truth about what these cards really are.
 
A device ID cannot be shared by two different devices because it's meant to be unique to each one. The first nMP teardowns will tell the truth about what these cards really are.

Meant to be, but if you were hiding development of a new GPU in a new product and had control of the OS...

Chipworks will find out in the end for sure, but I reckon it's the 'smoking gun' :D
 
Meant to be, but if you were hiding development of a new GPU in a new product and had control of the OS...

Chipworks will find out in the end for sure, but I reckon it's the 'smoking gun' :D

All true; however, there's not much to hide since they told us detailed specs. Ultimately it matters little: as VR said, calling it one or the other is semantics, and not really accurate anyway. It looks like they took the lineage of one or the other and created a distinct variant.
 
All true; however, there's not much to hide since they told us detailed specs. Ultimately it matters little: as VR said, calling it one or the other is semantics, and not really accurate anyway. It looks like they took the lineage of one or the other and created a distinct variant.

Exactly.

A device ID cannot be shared by two different devices because it's meant to be unique to each one. The first nMP teardowns will tell the truth about what these cards really are.

Yeah, but what I mean is that a 7870XT may appear in Mavericks as a D500, but I bet in 10.7 or 10.8 it appears as a 7870XT. The fact is, we know the D500 has a wider memory bus than a 7870XT (per Apple's specs), so the fact that the driver recognizes a 7870XT as a D500 only means it has "something" in common with a 7870XT... not "everything" in common.
 
Meant to be, but if you were hiding development of a new GPU in a new product and had control of the OS...

This was my initial thought when I fired up a flashed 7870XT in 10.9. But my second thought was that there's actually nothing to hide (as Cubemmal mentioned).

Yeah, but what I mean is that a 7870 may appear in Mavericks as a D300, but I bet in 10.7 or 10.8 it appears as a 7870. The fact is, we know the D300 has a wider memory bus than a 7870 (per Apple's specs), so the fact that the driver recognizes a 7870 as a D300 only means it has "something" in common with a 7870... not "everything" in common.

You meant D500, right? The D300 and D700 perfectly match their 7xxx counterparts. I see this a little differently, but there's actually not much to debate until they start to ship and we have first-hand info.
 
It turned out after the nMP launch that the D500 is a cut-down Tahiti XT2, not the 7950 nor the 7870XT. But it still uses the 7870XT DID (device ID). The nMP EFI update contains PC BIOSes and EFI ROMs for all Dx00 cards, so it was easy to discover.
The 7970 and 7870XT are no longer recognized as Dx00 cards as of 10.9.2, due to driver and OpenCL framework updates.
 
It turned out after the nMP launch that the D500 is a cut-down Tahiti XT2, not the 7950 nor the 7870XT. But it still uses the 7870XT DID (device ID). The nMP EFI update contains PC BIOSes and EFI ROMs for all Dx00 cards, so it was easy to discover.
The 7970 and 7870XT are no longer recognized as Dx00 cards as of 10.9.2, due to driver and OpenCL framework updates.

Well, that's interesting. So it's a dumbed-down 7970. Makes sense actually, since it simplifies the product line and makes the D500 more of a sweet spot, and maybe a good deal too.
 
It turned out after the nMP launch that the D500 is a cut-down Tahiti XT2, not the 7950 nor the 7870XT. But it still uses the 7870XT DID (device ID). The nMP EFI update contains PC BIOSes and EFI ROMs for all Dx00 cards, so it was easy to discover.
The 7970 and 7870XT are no longer recognized as Dx00 cards as of 10.9.2, due to driver and OpenCL framework updates.

Wait, then what are the D300s? Are they the same but lower clocked?
 
Well, that's interesting. So it's a dumbed-down 7970. Makes sense actually, since it simplifies the product line and makes the D500 more of a sweet spot, and maybe a good deal too.

Still interesting to see if AMD will be offering them through their distribution network or not.

Of course, it is Apple; they could have asked for a special variant of Tahiti anyhow.

Though I would have expected them to ask for a special packaging variation as well while they were at it, since lidless, low-profile, etc. seems to be the norm for what they need.
 
No comment on the recent revelation that the D300 doesn't have the faster double-precision (high-accuracy) operation that the D500 and D700 do?
 
No comment on the recent revelation that the D300 doesn't have the faster double-precision (high-accuracy) operation that the D500 and D700 do?

What exactly are you saying?

That the D300 only has 1/16-rate double-precision performance? It's still better than using the processor for functions that can just as easily use GPGPU resources.
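
For anyone wanting to check their own card: whether a device exposes doubles at all shows up in the OpenCL extension string; the 1/4 vs 1/16 rate itself isn't queryable. A minimal sketch, again assuming OS X's OpenCL framework (clang -framework OpenCL):

Code:
#include <stdio.h>
#include <string.h>
#include <OpenCL/opencl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint count = 0;

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &count) != CL_SUCCESS)
        return 1;

    for (cl_uint i = 0; i < count; i++) {
        char name[256] = {0};
        char exts[4096] = {0};
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_EXTENSIONS, sizeof(exts), exts, NULL);
        /* Look for an fp64 extension (cl_khr_fp64, or Apple's
           cl_APPLE_fp64_basic_ops). Present = doubles work at all;
           it says nothing about the 1/4 vs 1/16 throughput rate. */
        printf("%s: fp64 %s\n", name,
               strstr(exts, "fp64") ? "supported" : "not supported");
    }
    return 0;
}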
 
No comment on the recent revelation that the D300 doesn't have the faster double-precision (high-accuracy) operation that the D500 and D700 do?

It's not unexpected, I think. Apple has to differentiate the lineup.
 
D300:
– GPU 800 @1125 mV (Boost 850 MHz @1175 mV), memory 1270 MHz (5080 effective), TDP 116W

D500:
– GPU 650 @1025 mV (Boost 725 MHz @1075 mV), memory 1270 MHz (5080 effective), TDP 108W

D700:
– GPU 650 @918 mV (Boost 850 MHz @1100 mV), memory 1370 MHz (5480 effective), TDP 108W

If these clocks posted by 666sheep are correct, the D500 is gonna be significantly slower than a D300 at pretty much anything but double-precision compute.
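
For rough context, you can turn those clocks into theoretical peaks. A back-of-the-envelope sketch; the stream-processor counts and bus widths are taken from Apple's published specs, the boost clocks and effective memory rates from the list above, so treat the output as theory, not benchmarks:

Code:
#include <stdio.h>

int main(void) {
    struct spec { const char *name; int sps; double boost_ghz;
                  int bus_bits; double mem_ghz_eff; };
    /* SP counts / bus widths per Apple's specs; clocks per 666sheep */
    struct spec d[] = {
        { "D300", 1280, 0.850, 256, 5.080 },
        { "D500", 1526, 0.725, 384, 5.080 },
        { "D700", 2048, 0.850, 384, 5.480 },
    };
    for (int i = 0; i < 3; i++) {
        /* 2 flops per SP per clock (fused multiply-add) */
        double tflops = d[i].sps * 2.0 * d[i].boost_ghz / 1000.0;
        /* GB/s = effective transfer rate * bus width in bytes */
        double gbs = d[i].mem_ghz_eff * d[i].bus_bits / 8.0;
        printf("%s: ~%.1f TFLOPS SP, ~%.0f GB/s\n", d[i].name, tflops, gbs);
    }
    return 0;
}

That prints roughly 2.2/2.2/3.5 TFLOPS and 163/244/263 GB/s, in line with Apple's advertised figures: by these numbers the D500 barely edges the D300 in single precision and wins mainly on memory bandwidth and double-precision rate.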
 
If these clocks posted by 666sheep are correct, the D500 is gonna be significantly slower than a D300 at pretty much anything but double-precision compute.

Help me with this. The 500 and 700 are Tahiti, where the 300 is the Pitcairn architecture. Also, I don't see the clocks in the post above; is that what you mean by "GPU 650", i.e. "650 MHz"? Under any load it'll boost to 125 MHz less than the D300, but it has the more advanced architecture, a wider memory bus, and more stream processors. I'm pretty sure it'll spank the D300.
 
Help me with this. The 500 and 700 are Tahiti, where the 300 is the Pitcairn architecture. Also, I don't see the clocks in the post above; is that what you mean by "GPU 650", i.e. "650 MHz"? Under any load it'll boost to 125 MHz less than the D300, but it has the more advanced architecture, a wider memory bus, and more stream processors. I'm pretty sure it'll spank the D300.

Yup, those are MHz.
Tahiti really was built with compute in mind. Pitcairn is a much more effective architecture for things like games. That's why the highest-end mobile AMD GPUs are Pitcairn chips and not Tahiti.

For more information, read this review by Tom's Hardware:
http://www.tomshardware.com/reviews/workstation-graphics-card-gaming,3425-2.html

The W8000 is also a gimped Tahiti chip. The exact performance of the D500 remains to be seen, as they cut back in different areas, but I suspect benchmarks will confirm this in a couple of days. Then again, I may be wrong :)
 
Yup, those are MHz.
Tahiti really was built with compute in mind. Pitcairn is a much more effective architecture for things like games.

I'll read that article, but all the AnandTech coverage I've read (by Ryan <something>) says GCN is about compute AND graphics/gaming performance. The drivers also make a huge difference, obviously.

For example, the only game I play is Lord of the Rings Online. On my MP in Windows with a 7950, it had horrible stuttering at any settings under Eyefinity. Highly annoying; I had to go to a single monitor. Well, they've been working on the micro-stuttering (this looked more like macro-stuttering to me), and guess what? In the release from a week or two ago it works perfectly. Instead of going for max FPS they have implemented frame pacing, and it looks like a brand-new card now. Oddly, the FPS doesn't seem that much different either (40-60 fps), so I don't know what happened. Regardless, I think AMD missed the boat in chasing max FPS: stuttering is psychologically worse than a slightly lower FPS.
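
For anyone wondering what frame pacing actually does: instead of presenting frames as fast as they finish, you present on a fixed cadence so frame-to-frame time stays even. A toy POSIX sketch of the idea only; this has nothing to do with AMD's actual driver internals:

Code:
#include <stdio.h>
#include <time.h>

#define TARGET_FPS 60
#define FRAME_NS (1000000000L / TARGET_FPS)

static long now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000000000L + ts.tv_nsec;
}

int main(void) {
    long next = now_ns();
    for (int frame = 0; frame < 10; frame++) {
        /* render_frame() would run here; it may finish early or late */
        next += FRAME_NS;
        long wait = next - now_ns();
        if (wait > 0) {
            /* pace: sleep out the rest of the ~16.7 ms slot instead of
               presenting immediately, so frame intervals stay even */
            struct timespec ts = { wait / 1000000000L, wait % 1000000000L };
            nanosleep(&ts, NULL);
        } else {
            next = now_ns(); /* missed the slot: resync, don't burst */
        }
        printf("frame %d presented\n", frame);
    }
    return 0;
}

The trade-off is exactly what I saw in LOTRO: average FPS barely moves, but the variance between frames drops, which is what your eye notices.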
 
Yup, those are MHz.
Tahiti really was built with compute in mind. Pitcairn is a much more effective architecture for things like games. That's why the highest-end mobile AMD GPUs are Pitcairn chips and not Tahiti.

For more information, read this review by Tom's Hardware:
http://www.tomshardware.com/reviews/workstation-graphics-card-gaming,3425-2.html

The W8000 is also a gimped Tahiti chip. The exact performance of the D500 remains to be seen, as they cut back in different areas, but I suspect benchmarks will confirm this in a couple of days. Then again, I may be wrong :)

So will the D300 be better than the D500 and D700 in games?
 
So will the D300 be better than the D500 and D700 in games?

No, certainly not the D700; that will be significantly faster.

But with the D500 it might be very close. Apple could always make a crappier driver for the D300 ;)
I'm not paying $400 extra for a D500 until I see some benchmarks, which should come this week as the Mac Pros are already shipping.
 