Are we kind of being set up again by Apple, giving us outdated or low-performance cards in a very expensive computer?

Apple isn't guiltless, but it's also about the timing and efforts of the GPU vendors. If released in Q1 2013, this (with E5 v1 and TB v1) wouldn't have been good; it wasn't ready. So Apple is out of release synchronization with the GPU vendors on the Mac Pro. Not really a new problem.

When the nMP was announced there was a lot of talk about how powerful and awesome the graphics cards were going to be...

If you're price-insensitive, it still is powerful. Whether the $/performance is there is debatable, but on pure top-end performance it isn't that bad (if not looking at it from an AMD-fanboy-vs-Nvidia-fanboy perspective). With the "sneak peek" Apple only laid down details on the extreme top end (what they call the D700 cards now). They are more powerful than most other graphics cards out there.


The reason I ask is that even with the previous MP, the graphics card options, especially in the base configuration, were outdated and cheap in comparison with the cost of the machine. Is history repeating itself with the new Mac Pro?

It is not so much history repeating as customers repeating the same issues. It's far more about "price points" than outdated/cheap.
 
What I'm curious about, though, is whether the dual-GPU setup means we are actually seeing a combined pool of VRAM. Let's say I'm using a program that uses OpenCL, and say I have two D500s: does the app actually see 6 GB of VRAM to process with, not just 3 (even though two W7000s would be 8 GB total)? The fact that there no longer needs to be a physical link between the cards to CrossFire is encouraging. With Apple knowing you have two cards in there, it would be crazy not to share everything and just have the 2nd card for Thunderbolt lanes :p
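For what it's worth, under plain OpenCL the two GPUs enumerate as separate devices with separate memory pools, so any single buffer has to fit on one card: two D500s look like 2 x 3 GB, not one 6 GB pool. Here's a rough sketch of that allocation rule, with plain Python standing in for the OpenCL runtime (the `Device` class and sizes are purely illustrative):

```python
# Toy model of OpenCL-style per-device allocation: each GPU has its own
# VRAM pool, and any single buffer must fit entirely on one device.

GB = 1024 ** 3

class Device:
    def __init__(self, name, vram_bytes):
        self.name = name
        self.free = vram_bytes

    def alloc(self, nbytes):
        """Allocate a buffer on this device, or fail (like CL_MEM_OBJECT_ALLOCATION_FAILURE)."""
        if nbytes > self.free:
            raise MemoryError(f"{self.name}: buffer of {nbytes / GB:.1f} GB does not fit")
        self.free -= nbytes
        return nbytes

# Two D500-like devices: 3 GB each, 6 GB total across the machine.
gpus = [Device("D500 #0", 3 * GB), Device("D500 #1", 3 * GB)]

total_vram = sum(g.free for g in gpus)   # 6 GB in aggregate...
print(total_vram // GB)                  # 6

# ...but a single 5 GB buffer fails on both devices, because neither
# pool alone can hold it. The app sees 2 x 3 GB, not one 6 GB pool.
for g in gpus:
    try:
        g.alloc(5 * GB)
    except MemoryError as e:
        print(e)
```

An app can still use the full 6 GB, but only by manually splitting its data into per-device buffers.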
 
So Apple is out of release synchronization with the GPU vendors on the Mac Pro. Not really a new problem.

Yeah, but it's compounded by the lack of removability. Basically we're getting two W7000s with slashed RAM: less than $1000 (retail) in components, and I'm sure Apple is getting the two chips for less than $750. With the base specs, this is a ripoff. For $1999 or even $2250 I would have considered it; it already had so many cons re: expandability and cost, lack of CUDA, etc.

I am THE demographic for this product. I bought Mac Pros in the past, I make video for a living, and I work off a *very* similar workflow now: a MacBook Pro with a Thunderbolt monitor/RAID, a Thunderbolt video-out device, etc.

I work for a company that would spend this much on a computer if I said so. But I'd rather have a PC at this point. I can get SO much more for the money.

It would be hard for them to find a better market for their product than me, and I'm not doing it. This was badly planned out. No, you can't innovate, Tim.

The 2006 Mac Pro brought me into the Apple fold. I stayed for the iPhone. Within the last year I switched to Android, and now it looks like I have to switch to PC, which is a shame, but I want a powerhouse workstation, and I'm not going to pay that much more to stay Apple.
 
The D300's 256-bit bus, 1280 "cores", and 2 TFLOPS performance match up with the W7000

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#FirePro_Workstation_Series

Although the Scrooge McDuck move by Apple is gutting these back to just 2GB of VRAM instead of 4GB.

But I strongly believe that Phil was flashing "D300" as if it meant something when he knows full well it means nothing whatsoever right now.


So basically the GPU options start from a D300 which is a neutered W7000 (half the RAM), which clocks like a 7870.

The high end is a D700 (which Apple refuses to release the price of), which is like a W9000, which clocks like a 7970.

Pretty much what we expected, right? (apart from the price)
 
So basically the GPU options start from a D300 which is a neutered W7000 (half the RAM), which clocks like a 7870.

The high end is a D700 (which Apple refuses to release the price of), which is like a W9000, which clocks like a 7970.

Pretty much what we expected, right? (apart from the price)

This is looking accurate based on a LuxMark database I just found that has some D700 results in it. It slots right in at a 7970's power (that's one GPU, btw; no one knows how this will work in the Mac Pro with two).

http://www.luxrender.net/luxmark/top/top20/Room/GPU/1
 
So basically the GPU options start from a D300 which is a neutered W7000 (half the RAM), which clocks like a 7870.

The high end is a D700 (which Apple refuses to release the price of), which is like a W9000, which clocks like a 7970.

Pretty much what we expected, right? (apart from the price)

What I'm curious about is another thread where someone quoted TDP of 450W.

How is that going to work?

I am curious whether 274 W x 2 plus the CPU is ever going to fit in that without some serious throttling.
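It doesn't, at full rate. A back-of-the-envelope check using the quoted numbers (450 W envelope, 274 W per W9000-class card, and an assumed ~130 W for a typical E5 Xeon; all wattages are estimates from the thread and spec sheets, not measurements):

```python
# Rough power-budget check for the quoted 450 W figure.
# All numbers are assumptions from the thread / spec sheets.

ENVELOPE_W = 450          # quoted total TDP for the new Mac Pro
GPU_TDP_W = 274           # W9000-class board power, per card
CPU_TDP_W = 130           # typical E5 Xeon TDP (assumed)

full_rate = 2 * GPU_TDP_W + CPU_TDP_W
print(full_rate)                       # 678 W: well over the envelope

# Headroom left for the two GPUs combined if the CPU gets its full TDP:
gpu_budget = ENVELOPE_W - CPU_TDP_W    # 320 W for both cards
per_gpu = gpu_budget / 2               # 160 W each

# Fraction of full board power each GPU could sustain:
print(round(per_gpu / GPU_TDP_W, 2))   # ~0.58 -> lower clocks or throttling
```

So each GPU gets roughly 60% of a W9000's board power, which is consistent with the D-series shipping at reduced clocks rather than full W-series rates.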

The D300 could also be a W5000 with some shader clusters turned back on.

All about memory bandwidth at that point. W7000 has much more. More like Apple to hand out the doubly gimped W5000.
 
This is looking accurate based on a LuxMark database I just found that has some D700 results in it. It slots right in at a 7970's power (that's one GPU, btw; no one knows how this will work in the Mac Pro with two).

http://www.luxrender.net/luxmark/top/top20/Room/GPU/1

Well, that is to say, the D700 is a new FirePro product?
I think the D300/D500/D700 are just renamed, customized editions of current FirePro SKUs.
 
So basically the GPU options start from a D300 which is a neutered W7000 (half the RAM), which clocks like a 7870.

The high end is a D700 (which Apple refuses to release the price of), which is like a W9000, which clocks like a 7970.

Pretty much what we expected, right? (apart from the price)

I thought it was:

D300 = W5000
D500 = W7000
D700 = W9000
 
I thought it was:

D300 = W5000
D500 = W7000
D700 = W9000

I think, going by the stream processor counts listed below:

Card          Stream processors   VRAM     Memory bus   Bandwidth           Single precision
W5000/W7000   768/1280            2/4 GB   256-bit      102.4/153.6 GB/s    1.3/2.4 TFLOPS
D300          1280                2 GB     256-bit      160 GB/s            2 TFLOPS

W8000         1792                4 GB     256-bit      176 GB/s            3.23 TFLOPS
D500          1526                3 GB     384-bit      240 GB/s            2.2 TFLOPS

W9000         2048                6 GB     384-bit      264 GB/s            4 TFLOPS
D700          2048                6 GB     384-bit      264 GB/s            3.5 TFLOPS
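Those TFLOPS figures are internally consistent if you assume GCN's rate of 2 single-precision FLOPs per stream processor per clock. A quick check (the D-series clocks derived here are inferences from the published TFLOPS, not official specs):

```python
# Peak single-precision throughput for a GCN part:
# TFLOPS = stream processors * 2 FLOPs/clock * clock (GHz) / 1000

def sp_tflops(shaders, clock_mhz):
    """Peak single-precision TFLOPS (2 FLOPs per shader per clock)."""
    return shaders * 2 * clock_mhz / 1e6

# W9000 at its rated 975 MHz lands on the advertised ~4 TFLOPS:
print(round(sp_tflops(2048, 975), 2))        # 3.99

# Working backwards from the listed D-series TFLOPS gives the implied clocks:
def implied_clock_mhz(shaders, tflops):
    return tflops * 1e6 / (shaders * 2)

print(round(implied_clock_mhz(1280, 2.0)))   # D300: ~781 MHz
print(round(implied_clock_mhz(2048, 3.5)))   # D700: ~854 MHz
```

The implied D700 clock (~854 MHz) sits below the W9000's 975 MHz, which fits the thermal-throttling theory discussed above.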
 
So basically the GPU options start from a D300 which is a neutered W7000 (half the RAM), which clocks like a 7870.
....
Pretty much what we expected, right? (apart from the price)

There has been a faction here promoting the W5000 even though it doesn't support the number of display outputs the Mac Pro needs (7) to make all the video outputs operational. That faction also drinks the AMD FirePro Kool-Aid that the W5000 is a "mid-range" card (it's far closer to an entry-level card, and has trouble competing with iMac BTO GPUs).

But yes, Apple starting the lineup with a mid-range card and going up from there is what would reasonably be expected given their moves over the last 4-5 years.
 
VERY disingenuous.

Even more than Nvidia's Quadro cards, FirePro means "regular card marked up by 300-400% for no discernible performance increase".


Pro cards aren't (only) about performance increases.

They are about being supported by the hardware OEM in business apps. Nvidia Quadros for example have had a history of having tweaked drivers/firmware on the side of image quality over performance (anti-aliased line-drawing for example was one of those features the quadro had long ago that the geforce did not at the time).

In the Windows world, if you are running a professional app like, say, MineCAD, Surpac, SolidWorks, etc. on a consumer-grade card and it gives display bugs, you are on your own. Similarly, if you have hacked a consumer card to run in your Mac Pro (for example) and call a software vendor complaining about display bugs in a pro app, have a guess what level of support you will receive?

And I've seen it happen before - using a Geforce card on a CAD application and getting display problems because of it. Will it always happen? Of course not. But it is not guaranteed to work.

The extra you pay is for application support and actual quality drivers and firmware, rather than a rushed-out "get 5% more FPS in Call of Battlefield Duty 17!" (at the cost of stability and/or image quality).
 
Just install two 7970s into a 12-core 2009-2012 MP today and you have the same power as the new Mac Pro.

Will need external power, though.
 
Pro cards aren't (only) about performance increases.

They are about being supported by the hardware OEM in business apps. Nvidia Quadros for example have had a history of having tweaked drivers/firmware on the side of image quality over performance (anti-aliased line-drawing for example was one of those features the quadro had long ago that the geforce did not at the time).

In the Windows world, if you are running a professional app like, say, MineCAD, Surpac, SolidWorks, etc. on a consumer-grade card and it gives display bugs, you are on your own. Similarly, if you have hacked a consumer card to run in your Mac Pro (for example) and call a software vendor complaining about display bugs in a pro app, have a guess what level of support you will receive?

And I've seen it happen before - using a Geforce card on a CAD application and getting display problems because of it. Will it always happen? Of course not. But it is not guaranteed to work.

The extra you pay is for application support and actual quality drivers and firmware, rather than a rushed-out "get 5% more FPS in Call of Battlefield Duty 17!" (at the cost of stability and/or image quality).

In OSX, same drivers. Put a 7970 in 10.9 and see. And your knowledge of firmware is lacking as well.

Don't confuse PR with reality.

I can make regular cards run as Quadro and vice versa in OSX. Child's play.

But in the Windows world, hard-coded device IDs rule the drivers.
 
CrossFire would work in Windows though, right? One of the reasons I'd want a Mac Pro is for gaming. I have 50 million other reasons, but it does need to do gaming well, as well as other graphically intensive stuff, programming-wise...

I didn't get CrossFire to work with Windows. It doesn't boot at all when I put in two 7970 cards with CrossFire, and I have an external power supply.
 
The D300 seems to be based on the 7870 but at a lower clock, which makes sense due to the constraints of the thermal model.

Looks like the D300 is a V7900
http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/v7900/Pages/v7900.aspx#4

GPU
Stream Processors: 1280
Memory Interface: 256-bit
Memory
Size/Type: 2GB GDDR5
Bandwidth (GB/s): 160.0
Display Outputs
DisplayPort: 4 Standard
Max Resolution: 2560x1600 @ 60Hz


http://www.downloadatoz.com/howto/new-amd-firepro-v7900-v5900-cayman-based-4-3-displays,24676.html

The AMD FirePro V7900 is based on Cayman Pro GL with 1280 stream processors, which is obviously fewer than the Radeon HD 6950. Otherwise there is a 725 MHz core clock, 1.86 TFLOPS single / 0.464 TFLOPS double precision, 256-bit 2 GB GDDR5 memory, a 5000 MHz effective frequency, and 160 GB/s of memory bandwidth. Moreover, the FirePro V7900 has four DisplayPort outputs supporting the DP 1.2/HDMI 1.4a specs, up to 4 displays, and 3D output.


Review of V7900
http://www.phoronix.com/scan.php?page=article&item=amd_firepro_v7900&num=1

So it looks like 464 GFLOPS DP each (928 GFLOPS DP total) and about 4 TFLOPS single precision (32-bit) combined.
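Those V7900 figures check out arithmetically (2 FLOPs per shader per clock, and Cayman's 1/4-rate double precision), though the "4 TFLOPS" total is rounded up a bit:

```python
# Checking the V7900 numbers quoted above: 1280 shaders at 725 MHz,
# with Cayman's 1/4-rate double precision.

shaders, clock_mhz = 1280, 725

sp_tflops = shaders * 2 * clock_mhz / 1e6   # 2 FLOPs per shader per clock
dp_tflops = sp_tflops / 4                   # DP runs at 1/4 the SP rate

print(round(sp_tflops, 2))          # 1.86 TFLOPS SP per card
print(round(dp_tflops * 1000))      # 464 GFLOPS DP per card

# Two such cards in the machine:
print(round(2 * dp_tflops * 1000))  # 928 GFLOPS DP total
print(round(2 * sp_tflops, 2))      # 3.71 TFLOPS SP total ("4" is generous rounding)
```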
 
The D300 seems to be based on the 7870 but at a lower clock, which makes sense due to the constraints of the thermal model.

Looks like the D300 is a V7900
http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/v7900/Pages/v7900.aspx#4

GPU
Stream Processors: 1280
Memory Interface: 256-bit
Memory
Size/Type: 2GB GDDR5
Bandwidth (GB/s): 160.0
Display Outputs
DisplayPort: 4 Standard
Max Resolution: 2560x1600 @ 60Hz


http://www.downloadatoz.com/howto/new-amd-firepro-v7900-v5900-cayman-based-4-3-displays,24676.html

The AMD FirePro V7900 is based on Cayman Pro GL with 1280 stream processors, which is obviously fewer than the Radeon HD 6950. Otherwise there is a 725 MHz core clock, 1.86 TFLOPS single / 0.464 TFLOPS double precision, 256-bit 2 GB GDDR5 memory, a 5000 MHz effective frequency, and 160 GB/s of memory bandwidth. Moreover, the FirePro V7900 has four DisplayPort outputs supporting the DP 1.2/HDMI 1.4a specs, up to 4 displays, and 3D output.


Review of V7900
http://www.phoronix.com/scan.php?page=article&item=amd_firepro_v7900&num=1

So it looks like 464 GFLOPS DP each (928 GFLOPS DP total) and about 4 TFLOPS single precision (32-bit) combined.

Oh, that seems bad...
Before your post, I assumed D300/D500/D700 were special editions of the AMD FirePro W5000/W7000/W8000/W9000: Apple combining the W5000 and W7000 into the D300, then W8000 -> D500 and W9000 -> D700.
But the info you provided is another explanation of the D300, which also seems interesting.

----------

The D300 seems to be based on the 7870 but at a lower clock, which makes sense due to the constraints of the thermal model.

Looks like the D300 is a V7900
http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/v7900/Pages/v7900.aspx#4

GPU
Stream Processors: 1280
Memory Interface: 256-bit
Memory
Size/Type: 2GB GDDR5
Bandwidth (GB/s): 160.0
Display Outputs
DisplayPort: 4 Standard
Max Resolution: 2560x1600 @ 60Hz


http://www.downloadatoz.com/howto/new-amd-firepro-v7900-v5900-cayman-based-4-3-displays,24676.html

The AMD FirePro V7900 is based on Cayman Pro GL with 1280 stream processors, which is obviously fewer than the Radeon HD 6950. Otherwise there is a 725 MHz core clock, 1.86 TFLOPS single / 0.464 TFLOPS double precision, 256-bit 2 GB GDDR5 memory, a 5000 MHz effective frequency, and 160 GB/s of memory bandwidth. Moreover, the FirePro V7900 has four DisplayPort outputs supporting the DP 1.2/HDMI 1.4a specs, up to 4 displays, and 3D output.


Review of V7900
http://www.phoronix.com/scan.php?page=article&item=amd_firepro_v7900&num=1

So it looks like 464 GFLOPS DP each (928 GFLOPS DP total) and about 4 TFLOPS single precision (32-bit) combined.

No! There is another question. The AMD FirePro product comparison chart (http://www.amd.com/us/products/workstation/graphics/ati-firepro-3d/Pages/product-comparison.aspx) shows that the V7900 ONLY supports up to 1600p (NOT 4K!).
So the D300 is also a renamed, custom edition of the W7000, although the W7000's 2.4 TFLOPS is more than the D500's 2.2 TFLOPS.
 
I suspect that comparing the D300/D500/D700 to current models will not yield accurate results.

They are probably new cards based on the just released architecture and will use Mantle to implement CrossFire.

That's my guess. It fits with the MacPro being delayed and not getting any specifics (such that they are) until after AMD made their product announcement.

When it comes to Quadro vs GeForce and FirePro vs Radeon, I can see why one would suspect that the workstation cards are nothing more than relabelled, higher-priced versions of their desktop-class counterparts. It's the same argument made when discussing Xeon vs Pentium or Xeon vs i7. While I am certainly not an expert in the matter, the fact is that there is a distinct difference between the two. The desktop-class graphics cards and CPUs are more flexible in the types of tasks they can tackle and appear to tackle most of that much wider range of tasks with gusto. However, the workstation cards are designed for a very narrow band of tasks and handle those specific tasks better than their desktop counterparts. They fall to pieces when confronted with gaming and desktop tasks because they were not designed for that. They can build games, not play them.

I also have no doubt that the hardware is very similar. But I suspect that if you did manage to flash the firmware of a Quadro to make it behave like a GeForce what you would effectively be doing is making certain features unavailable and altering the way the controllers use the remaining components. And if you did flash a GeForce to make it a Quadro I suspect there would be something missing.

For the most part this distinction is intangible except to those who utilize workstation cards to perform those specific tasks they were designed for. The distinction is further hidden within benchmark comparisons because most benchmarking software tests are based on desktop compute tasks, like gaming and whatnot, rather than workstation tasks. I suspect the results of these tests to be meaningless. Comparing the benchmarks of a Quadro to a GeForce probably, in reality, won't tell you the whole story. In order to compare them apples-to-apples you would have to find common ground and I suspect that the only common ground between the two are factors that are relatively meaningless.

Now keep in mind that my perspective is not necessarily that of the user, but rather that of the technician spec'ing out and deploying the machine to users.

That's just my piece.
 
I suspect that comparing the D300/D500/D700 to current models will not yield accurate results.

They are probably new cards based on the just released architecture and will use Mantle to implement CrossFire.

That's my guess. It fits with the MacPro being delayed and not getting any specifics (such that they are) until after AMD made their product announcement.

When it comes to Quadro vs GeForce and FirePro vs Radeon, I can see why one would suspect that the workstation cards are nothing more than relabelled, higher-priced versions of their desktop-class counterparts. It's the same argument made when discussing Xeon vs Pentium or Xeon vs i7. While I am certainly not an expert in the matter, the fact is that there is a distinct difference between the two. The desktop-class graphics cards and CPUs are more flexible in the types of tasks they can tackle and appear to tackle most of that much wider range of tasks with gusto. However, the workstation cards are designed for a very narrow band of tasks and handle those specific tasks better than their desktop counterparts. They fall to pieces when confronted with gaming and desktop tasks because they were not designed for that. They can build games, not play them.

I also have no doubt that the hardware is very similar. But I suspect that if you did manage to flash the firmware of a Quadro to make it behave like a GeForce what you would effectively be doing is making certain features unavailable and altering the way the controllers use the remaining components. And if you did flash a GeForce to make it a Quadro I suspect there would be something missing.

For the most part this distinction is intangible except to those who utilize workstation cards to perform those specific tasks they were designed for. The distinction is further hidden within benchmark comparisons because most benchmarking software tests are based on desktop compute tasks, like gaming and whatnot, rather than workstation tasks. I suspect the results of these tests to be meaningless. Comparing the benchmarks of a Quadro to a GeForce probably, in reality, won't tell you the whole story. In order to compare them apples-to-apples you would have to find common ground and I suspect that the only common ground between the two are factors that are relatively meaningless.

Now keep in mind that my perspective is not necessarily that of the user, but rather that of the technician spec'ing out and deploying the machine to users.

That's just my piece.

According to the single-precision numbers, the D500 is behind the W7000 and the D700 is behind the W9000.

I don't think a FirePro based on Hawaii (the new R9) would be behind Tahiti (the current W series).

There are two possible reasons:
1. for cooling and power control, AMD had to slow the new FirePros down;
2. the FirePro D series is just a renamed, further-customized edition of Tahiti.

I think situation 2 is more likely: according to #32, the D700 already had benchmark data in June 2013, and the D series trails the W series across the board, cut by too much.

By the way, the naming of the D series is very similar to the Radeon Sky series: Sky 500/700/900.

Of course, this is all conjecture; only when we get the nMP can we run SPECviewperf and compare it against other workstation graphics cards.

Cheers,

Shawn
 
TrueAudio doesn't do any AD or DA conversions at all. It is a straight digital-to-digital conversion.

If Apple puts a real API that developers can commonly use over the hardware then it will be more than a gimmick. If every single app developer has to do a custom DSP kernel then it would be a gimmick.

Long term, audio and video processing is likely to get integrated. With ever-larger transistor budgets it isn't that hard to throw multiple functions into one integrated, lower-cost component.

I'm well aware of what TrueAudio is and how it functions. It still doesn't change the fact that it's a cheap gimmick meant to offer a crutch to game developers who haven't been arsed to code a proper sound engine since the turn of the millennium. It encourages laziness rather than recognizing the brilliant sound design some developers put into their work, and I want nothing of it.

Also, I doubt we'll see re-integration of unique functionality any time soon. People want a small, all-purpose device that they can configure to their particular needs. See the Mac Pro, Mac Mini, and the soaring popularity of laptops and tablets as evidence.
 
By the way, the naming of the D series is very similar to the Radeon Sky series: Sky 500/700/900.
You're probably on to something there. The D500 and D700 specs are close to the Sky 500 and 700. The Sky 900, on the other hand, is a dual-GPU card, so it isn't really comparable.

http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards
These are passively cooled cards intended to be sold directly to cloud gaming providers, and despite the Radeon name are not consumer cards.
 
They are probably new cards based on the just released architecture and will use Mantle to implement CrossFire.

There will be no Crossfire on the Mac Pro under OS X.

However, if Apple decided to go that route, they already have the underpinnings to add it without Mantle. An app can manually implement CrossFire-style multi-GPU work splitting right now if it chooses to. No need for Mantle.
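On the "an app can do it manually" point: the usual pattern is plain workload splitting, where the app enqueues half the work on each device and merges the results. A toy sketch, with threads standing in for the two GPU command queues (the function names `run_on_device` and `split_across_gpus` are hypothetical; no real OpenCL calls here):

```python
# Toy illustration of app-level multi-GPU work splitting (no CrossFire,
# no Mantle): the app divides the job between two device queues and
# merges the results. Plain threads stand in for OpenCL command queues.

from concurrent.futures import ThreadPoolExecutor

def run_on_device(device_id, chunk):
    """Stand-in for enqueueing a kernel on one GPU's command queue."""
    return [x * x for x in chunk]  # pretend this is the GPU kernel

def split_across_gpus(data, n_devices=2):
    # Split the workload evenly, dispatch one chunk per device, merge.
    chunks = [data[i::n_devices] for i in range(n_devices)]
    with ThreadPoolExecutor(max_workers=n_devices) as pool:
        results = list(pool.map(run_on_device, range(n_devices), chunks))
    # Re-interleave so output order matches input order.
    out = [0] * len(data)
    for dev, chunk in enumerate(results):
        out[dev::n_devices] = chunk
    return out

print(split_across_gpus([1, 2, 3, 4, 5]))  # [1, 4, 9, 16, 25]
```

The same shape works with two OpenCL command queues instead of threads; the point is that the split-and-merge lives entirely in the application, with no driver-level CrossFire involved.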
 