Mine should be arriving tomorrow. I plan on installing Windows and running games for testing. So, we should know soon.

Could you please confirm whether the drivers are FirePro drivers? I'm planning to use it for 3D CAD as well, which prefers workstation drivers (OpenGL-optimised) over gaming drivers (DirectX-optimised).
 
There was sarcasm there.

CrossFire vs. a single Titan.

Point being, this is supposed to be 2x D700, which people are comparing to the W9000, and it's just beating a single Quadro K6000-class card (the Titan).

I'm glad there is finally a Mac with decent gaming potential, but one needs to compare apples with apples.


Ahh, that old chestnut. My enthusiasm was based on workstation cards being used for gaming. If I were to compare apples with apples then I would go up against Quadros really, as that is closer to a real comparison. The fact the Mac Pro competes with and beats a Titan in a full DirectX benchmark is like finding out your tractor has a secondary sports-car engine. It is really good news for those of us who expected this machine (based on history) to perform subpar in gaming.

And it isn't just pure gaming here. I am a 3D artist (CAD/OpenCL) who uses Unity3D (a gaming engine) for heavy virtual-reality environments, and this single machine is capable of replacing two computers now: my Mac and an above-average PC gaming rig. I just hope OS switching is nice and fast too with the quick PCIe SSDs.

When the in-depth tests come out, we need a GPU monitor running so we can see the utilisation of the hardware and make sure CrossFire is working as shown in the YouTube benchmark video. Personally, 60 to 90 FPS in any game at 1080p would be sweet. Also worth testing are the higher-clocked quad/hex-core CPUs, as those should add more FPS than the 8/12-core.
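
If anyone fancies it, GPU-Z's "Log to file" option makes this easy to check after a session. A rough sketch that summarises such a log; the column names and the gpu1_log.csv/gpu2_log.csv filenames are placeholders, as GPU-Z's exact headers vary by version:

Code:
# Summarise a GPU-Z "Log to file" CSV: average load and clock range.
# Adjust load_col/clock_col to match your log's actual header row.
import csv

def summarise(path, load_col="GPU Load [%]", clock_col="Core Clock [MHz]"):
    loads, clocks = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                loads.append(float(row[load_col]))
                clocks.append(float(row[clock_col]))
            except (KeyError, ValueError):
                continue  # skip malformed rows
    if loads:
        print(f"{path}: avg load {sum(loads)/len(loads):.0f}%, "
              f"clocks {min(clocks):.0f}-{max(clocks):.0f} MHz")

# One log per card: both averaging near 100% load means CrossFire is
# pulling its weight; clocks sagging after warm-up hints at throttling.
for log in ("gpu1_log.csv", "gpu2_log.csv"):
    summarise(log)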


Anim
 
Ahh, that old chestnut. My enthusiasm was based on workstation cards being used for gaming. If I were to compare apples with apples then I would go up against Quadros really, as that is closer to a real comparison.

Why would you just compare Quadros? That's just a word/brand.
The only things we should be interested in are performance and cost relative to personal usage scenarios, surely?
 
Why would you just compare Quadros? That's just a word/brand.
The only things we should be interested in are performance and cost relative to personal usage scenarios, surely?

Workstation card vs. competitive workstation card. Comparing a workstation card to a gaming card isn't usually a good test, as the drivers are optimised for different things. Also, gaming cards have gimped GPGPU acceleration, which is how Nvidia and AMD make their money.

If we flip this around and test a gaming card in a workstation environment then there is no comparison: the workstation card will usually stomp all over the gaming card. The same is supposed to be true the other way around: the gaming card will stomp all over the workstation card when it comes to games. What we see here, though, is a smaller gap; the D700 should be average at games, but it is showing much more than that.

I have always wondered why the top-end cards can't be good for both gaming and pro work, with dual drivers for whatever it is you are doing.
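
You can actually see a slice of that gimping from software. A minimal pyopencl sketch (assuming pyopencl is installed) that just asks each OpenCL device whether it exposes double precision at all; this flag won't show the rate-limiting, but it's a start:

Code:
# List each OpenCL device and whether it advertises double-precision
# (FP64) support; a DOUBLE_FP_CONFIG of 0 means no usable FP64 at all.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        try:
            fp64 = dev.get_info(cl.device_info.DOUBLE_FP_CONFIG)
        except cl.LogicError:
            fp64 = 0  # parameter not reported on this platform
        print(f"{dev.name.strip()}: FP64 {'yes' if fp64 else 'no'}")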
 
Ahh, that old chestnut. My enthusiasm was based on workstation cards being used for gaming. If I were to compare apples with apples then I would go up against Quadros really, as that is closer to a real comparison. The fact the Mac Pro competes with and beats a Titan in a full DirectX benchmark is like finding out your tractor has a secondary sports-car engine. It is really good news for those of us who expected this machine (based on history) to perform subpar in gaming.

And it isn't just pure gaming here. I am a 3D artist (CAD/OpenCL) who uses Unity3D (a gaming engine) for heavy virtual-reality environments, and this single machine is capable of replacing two computers now: my Mac and an above-average PC gaming rig. I just hope OS switching is nice and fast too with the quick PCIe SSDs.

When the in-depth tests come out, we need a GPU monitor running so we can see the utilisation of the hardware and make sure CrossFire is working as shown in the YouTube benchmark video. Personally, 60 to 90 FPS in any game at 1080p would be sweet. Also worth testing are the higher-clocked quad/hex-core CPUs, as those should add more FPS than the 8/12-core.


Anim

Ummmm, no, it just shows how damn good the Quadro is at gaming, as your tractor needs two sports-car engines ;)

I'm just glad the Mac Pro is finally a decent gaming rig as a secondary function, but I am under no illusion that a single D700 can touch a Titan in gaming.

----------

Workstation card vs. competitive workstation card. Comparing a workstation card to a gaming card isn't usually a good test, as the drivers are optimised for different things. Also, gaming cards have gimped GPGPU acceleration, which is how Nvidia and AMD make their money.

If we flip this around and test a gaming card in a workstation environment then there is no comparison: the workstation card will usually stomp all over the gaming card. The same is supposed to be true the other way around: the gaming card will stomp all over the workstation card when it comes to games. What we see here, though, is a smaller gap; the D700 should be average at games, but it is showing much more than that.

I have always wondered why the top-end cards can't be good for both gaming and pro work, with dual drivers for whatever it is you are doing.

I'd put money on a Titan to beat a D700 in both.

Also, these tests were done under Windows, right? I am certain there are no optimised D700 drivers under Windows, so what drivers are they using?
 
Ummmm, no, it just shows how damn good the Quadro is at gaming, as your tractor needs two sports-car engines ;)

I'm just glad the Mac Pro is finally a decent gaming rig as a secondary function, but I am under no illusion that a single D700 can touch a Titan in gaming.

I agree; I didn't mean to say a single card was as good as a Titan. Not sure I said that, but it could have been read that way. Like you, I'm just happy my new Mac will be better than my current gaming machine, which I can pass to a relative or something. And every Quadro I have had was pants at gaming. But that's going back a few years; not sure what the current ones are like.
 
I agree; I didn't mean to say a single card was as good as a Titan. Not sure I said that, but it could have been read that way. Like you, I'm just happy my new Mac will be better than my current gaming machine, which I can pass to a relative or something. And every Quadro I have had was pants at gaming. But that's going back a few years; not sure what the current ones are like.

I had a huge smile when I learned that CrossFire worked in Windows. Stoked to have a Mac Pro I can also use for gaming as a secondary function.

Good times.

Also, if one games, the hex-core is the sweet spot: six cores for workstation tasks, and fast at single-core tasks.
 
Workstation card vs. competitive workstation card. Comparing a workstation card to a gaming card isn't usually a good test, as the drivers are optimised for different things. Also, gaming cards have gimped GPGPU acceleration, which is how Nvidia and AMD make their money.

If we flip this around and test a gaming card in a workstation environment then there is no comparison: the workstation card will usually stomp all over the gaming card. The same is supposed to be true the other way around: the gaming card will stomp all over the workstation card when it comes to games. What we see here, though, is a smaller gap; the D700 should be average at games, but it is showing much more than that.

I have always wondered why the top-end cards can't be good for both gaming and pro work, with dual drivers for whatever it is you are doing.

There's less gimping of chips now than ever before; it's not always the case, and like I said, it depends on your usage.
I built some 7x 7970 number-crunching machines a while ago; testing showed that AMD gaming cards were faster than FirePros, which were far faster than Quadros, which were far faster than GTX cards. They were also cheap, and the performance/cost comparison was a no-brainer.
That's kind of a 'workstation environment'? I'm not at the mercy of software switches in certain big-name applications turning off features dependent on drivers; it's just a massively parallel GPGPU workload that pegs everything it can take. It's also a completely unbiased benchmark for similar workloads.

The drivers aren't so much optimised for specific jobs; it's more along the lines of selective gimping. The more people get hung up on workstation branding, the more this cycle of driver/application/benchmark crippling will continue.

I understand that some industries might have to use application X, which requires certified systems with support etc., and that big business forces them to pay through the nose for it. What confuses me is how this applies to comparisons on a tech forum full of enthusiasts.
 
There's less gimping of chips now than ever before; it's not always the case, and like I said, it depends on your usage.
I built some 7x 7970 number-crunching machines a while ago; testing showed that AMD gaming cards were faster than FirePros, which were far faster than Quadros, which were far faster than GTX cards. They were also cheap, and the performance/cost comparison was a no-brainer.
That's kind of a 'workstation environment'? I'm not at the mercy of software switches in certain big-name applications turning off features dependent on drivers; it's just a massively parallel GPGPU workload that pegs everything it can take. It's also a completely unbiased benchmark for similar workloads.

The drivers aren't so much optimised for specific jobs; it's more along the lines of selective gimping. The more people get hung up on workstation branding, the more this cycle of driver/application/benchmark crippling will continue.

I understand that some industries might have to use application X, which requires certified systems with support etc., and that big business forces them to pay through the nose for it. What confuses me is how this applies to comparisons on a tech forum full of enthusiasts.

Was that a Bitcoin mining farm, by any chance?

To me, and correct me if I am wrong here, 'workstation' to most people means stability, and in business that is crucial. I have cut corners in the past using gaming cards for pro work and paid the penalty: e.g. artefacts appearing in renderings that simply should not be there, where re-rendering the frames makes the artefacts vanish for no apparent reason. Render the same content using a workstation card and it just churns through without any surprises. When deadlines are looming you just have to bite the bullet and pay the extra for the reliability. Or, on another note, trying to use DirectX drivers for CAD work and getting all sorts of issues, with textures screwing up or viewport geometry going crazy with incorrect depth sorting.

I don't get that with workstation cards and their certified drivers. I have first-hand experience of this.

But I am not sure how off topic we are going here :)
 
Based on the numbers in the chart above, the nMP wasn't performing as well as even a single R9 280 / HD 7970.

From what I understand so far, the 280 is the closest match to the D700. Possible that it wasn't running in CrossFire mode in BF4 in their demo?

The 3DMark scores definitely suggest CrossFire was working, though. Maybe something that will improve with driver updates? Or with the AMD drivers rather than the Apple-provided ones?

Maybe. Also keep in mind the base clock on the D700 is 650 MHz with a boost of 850 MHz, whereas many 280Xs have a base clock of 1000 MHz or more. Perhaps 3DMark (the first test he ran) was a cold benchmark, and while hot it doesn't achieve a high boost in CrossFire.
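
Back-of-the-envelope, the clock gap alone explains a fair chunk. For these Tahiti-class chips, peak single-precision throughput is roughly shaders x 2 ops (FMA) x clock; a quick illustrative sketch:

Code:
# Rough peak single-precision throughput for Tahiti-class cards.
def tflops(shaders, mhz):
    return shaders * 2 * mhz * 1e6 / 1e12

print(f"D700 @ 650 MHz base:  {tflops(2048, 650):.2f} TFLOPS")   # ~2.66
print(f"D700 @ 850 MHz boost: {tflops(2048, 850):.2f} TFLOPS")   # ~3.48
print(f"280X @ 1000 MHz:      {tflops(2048, 1000):.2f} TFLOPS")  # ~4.10
# Two D700s at full boost give ~6.96 TFLOPS, presumably Apple's
# "7 teraflops" figure; stuck at base clock, each card gives up
# roughly 24% versus boost.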

CF generates a lot of heat. This is why I've been saying we need more thorough testing.
 
CF generates a lot of heat. This is why I've been saying we need more thorough testing.

Not sure what to make of that statement. The GPUs are doing work, and CF by itself should generate very little heat, as it's simply bus/DMA overhead. Without CF, a multi-GPU system will idle all the GPUs except the one being used. With CF, all the GPUs are working, so by definition it will generate more heat.

The nMP's thermal budget is designed to accommodate both GPUs working, so I don't see the issue.
 
Not sure what to make of that statement. The GPUs are doing work, and CF by itself should generate very little heat, as it's simply bus/DMA overhead. Without CF, a multi-GPU system will idle all the GPUs except the one being used. With CF, all the GPUs are working, so by definition it will generate more heat.

The nMP's thermal budget is designed to accommodate both GPUs working, so I don't see the issue.

I hope there are zero heat/thermal issues, as there are times when I will run this machine flat out 24/7 for long periods. I don't care if it gets loud during those times.
 
OK, now that we've established that CrossFire is enabled, the next consideration is: what drivers? AMD has been putting a huge amount of work into their consumer drivers recently to address frame-pacing issues, and has recently released frame-pacing support for CrossFire and Eyefinity.

I only play Lord of the Rings Online, and have been doing so on my 2009 MP with a Sapphire 7950 in a three-monitor Eyefinity setup. Previously the stuttering was bad enough to keep me from using this setup; however, with the just-released drivers it's much better and quite playable.

Presumably Boot Camp installed the FirePro drivers. Is it a simple matter to install the 7970 drivers, and do you still get CrossFire in that case? It's possible there is a special build of the FirePro drivers to support the D700s.
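
For whoever has a machine in hand, Windows will at least say which display driver it actually loaded. A quick sketch using the stock wmic tool, so nothing extra to install:

Code:
# Ask Windows which display driver is loaded; a FirePro-branded name
# or driver version here would suggest Boot Camp shipped FirePro
# drivers rather than plain Radeon (7970) ones.
import subprocess

print(subprocess.check_output(
    ["wmic", "path", "win32_videocontroller",
     "get", "name,driverversion,driverdate"],
    text=True))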
 

Almost: scrypt.
(Anyone directly mining bitcoins with a computer is 'doing it wrong' :))

Yes, to most people 'workstation' does mean stability. That's my problem, though: everything is inherently the same, and artificially imposed issues really bug me. With my workloads I can see pretty much perfect performance scaling between cards just by looking at shader quantity, core clock, etc. I don't have to worry about what it says on the box; it's how it should be (in my naive little mind).

I certainly can't argue with your real-world experience (there are so many comments on these forums about GPGPU/FirePro/performance/etc. from non-GPGPU users who assume quality/performance always comes with price and that compromises have to be made), and I appreciate that if you've used two products that perform differently, then it's very easy to make a purchasing decision.

I don't think it's off topic :) If it helps put it back on, I'm still trying to understand whether CrossFire on these cards uses a physical bridge or, by some miracle, runs over the bus.
 
Not sure what to make of that statement. The GPUs are doing work, and CF by itself should generate very little heat, as it's simply bus/DMA overhead. Without CF, a multi-GPU system will idle all the GPUs except the one being used. With CF, all the GPUs are working, so by definition it will generate more heat.

That's exactly what I meant. Clearly when not in use, the GPUs will be fine, CF or not. CF gaming is going to essentially double the heat, and you still have the CPU.

The nMP's thermal budget is designed to accommodate both GPUs working, so I don't see the issue.

Absolutely; I highly doubt the computer will shut off. However, I would really like to know what clock the GPUs are running at during CF gaming after the machine warms up. Apple says the "thermal core" is the greatest thing ever and is just as good as three individual fans and heat sinks... I just want proof, is all.

That's why I'd like to see some "hot" benchmarks. That's the way NCIX benchmarks, and I think it's the most applicable approach, especially in this case.

Would you really be surprised if the nMP used a lower clock when using both GPUs and the CPU?

----------

OK, now that we've established that CrossFire is enabled, the next consideration is: what drivers? AMD has been putting a huge amount of work into their consumer drivers recently to address frame-pacing issues, and has recently released frame-pacing support for CrossFire and Eyefinity.

I only play Lord of the Rings Online, and have been doing so on my 2009 MP with a Sapphire 7950 in a three-monitor Eyefinity setup. Previously the stuttering was bad enough to keep me from using this setup; however, with the just-released drivers it's much better and quite playable.

Presumably Boot Camp installed the FirePro drivers. Is it a simple matter to install the 7970 drivers, and do you still get CrossFire in that case? It's possible there is a special build of the FirePro drivers to support the D700s.

I would also like to know how the drivers behave using professional apps in Windows. Not to sound like a broken record, but again: I'd like hot benchmarks, please.

I believe the drivers for the D700 in Windows are fine, probably nearly identical to either the FirePro or the 7970 drivers (because they are nearly identical GPUs). We may be seeing the maximum performance these cards can do.

I think the discrepancy between the Geekbench and BF4 performance was not a driver problem, but a down-clock due to heat.
 
Impressive news, for both the CrossFire support and the cards being recognised as FirePro by Windows and applications.

As AMD's CrossFire uses a special hardware cable on retail PCIe cards, I was not sure about Boot Camp support.

And as the FirePro driver is what makes the W9000 expensive, I believed the Dxx cards would be supported as Radeon and not FirePro.

Although I'm almost sure that AMD itself will not "officially" support them in its certified Windows product line, that's not the most important thing.

Now I'm just wondering whether the D500/D700 use ECC VRAM or not. We've seen plenty of speculation about it, but still no proper software confirmation.
 
Absolutely; I highly doubt the computer will shut off. However, I would really like to know what clock the GPUs are running at during CF gaming after the machine warms up. ... Would you really be surprised if the nMP used a lower clock when using both GPUs and the CPU?

Got it. Yeah, do they scale back under load? I would be surprised, as that would seemingly contradict the 7-teraflops spec, since you only get that under (presumably sustained) full load. Another hint is that Apple has carefully tweaked these parts: special clocks, cores, etc. It seems likely they did this to manage the thermal envelope. If they scaled back under load, why not scale up otherwise?

Finally, I work in a similar industry, and I know that thermals are one of the highest concerns. Hardware engineers don't usually play games with this; they set the ceiling at the peak performance under full load and include a guard band. I doubt the Apple engineers would gimp this, but who knows; Apple engineers do "think different".

----------

Now I'm just wondering whether the D500/D700 use ECC VRAM or not. We've seen plenty of speculation about it, but still no proper software confirmation.

Yes, I'm wondering that too. It seems more than likely, since these are meant for GPU compute, which is what OpenCL is all about.
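
OpenCL can give exactly that software confirmation, for what it's worth: devices report an error-correction flag. Another small pyopencl sketch, same assumption that pyopencl is installed:

Code:
# Ask each OpenCL device whether it claims ECC-protected memory
# (CL_DEVICE_ERROR_CORRECTION_SUPPORT); True on the D500/D700 would
# be the confirmation we're after.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        ecc = dev.get_info(cl.device_info.ERROR_CORRECTION_SUPPORT)
        print(f"{dev.name.strip()}: ECC {'yes' if ecc else 'no'}")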
 
do they scale back under load? ...
Yes, I'm wondering that too.

They'll automatically scale back if they hit a temperature threshold, of course. I assume the same happens if it's possible for the entire system to hit a specific power load; 450 W sounds awfully small!

No reason they couldn't have ECC; Tahiti supports it, obviously. However, I live in my own selfish little world and don't really see the point, even if the RAM price premium is low (~20%?) to a giant like Apple. I can't see anyone running critical loads for months on end on a little desktop device like this, and EDC RAM is crucial to performance scaling on modern GPUs and is perfectly adequate?
 
CrossFire has always worked under Windows on the Mac Pro... it's not hardware dependent :rolleyes:

It worked even if you did not hook up the CrossFire sockets on the two cards?
If so, it's not hardware dependent. If it required the connection, then it is.

It appears that Apple did throw pins and traces at moving the proprietary CrossFire data. It isn't an edge connector they are using on the GPU cards, so perhaps the incremental costs were offset by the upside of getting more capabilities out of the Windows drivers they are licensing.

Given the trend that AMD is on with CFX (over PCIe), I now don't think there is a big future for this iteration of Apple's connector, which likely means this would be the only set of cards that will ever come to this iteration of the Mac Pro. There are probably some short-term vs. long-term trade-offs in what they decided on here.
 
They'll automatically scale back if they hit a temperature threshold, of course.

Correct; the question is where that threshold is and whether it will be hit.

With 450 W total and a 130 W CPU, that leaves 320 W. Say 20 W for support circuitry, and that leaves 150 W for each GPU. Most 7970s are specced for 250 W, so that leaves us with a seeming 100 W deficit. However, regular cards have overclocking headroom which Apple doesn't need to support. I bet that those GPUs, at those clocks, really only need 150 W.
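
The same arithmetic, spelled out (all rough estimates from above, not measurements):

Code:
# Rough nMP power budget.
psu_w, cpu_w, support_w = 450, 130, 20
per_gpu_w = (psu_w - cpu_w - support_w) / 2   # -> 150 W per GPU
print(f"{per_gpu_w:.0f} W per GPU")  # vs ~250 W board power for a
                                     # retail 7970 with OC headroom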
 
It worked even if you did not hook up the CrossFire sockets on the two cards?
If so, it's not hardware dependent. If it required the connection, then it is.

It appears that Apple did throw pins and traces at moving the proprietary CrossFire data. It isn't an edge connector they are using on the GPU cards, so perhaps the incremental costs were offset by the upside of getting more capabilities out of the Windows drivers they are licensing.

Given the trend that AMD is on with CFX (over PCIe), I now don't think there is a big future for this iteration of Apple's connector, which likely means this would be the only set of cards that will ever come to this iteration of the Mac Pro. There are probably some short-term vs. long-term trade-offs in what they decided on here.

It's a proprietary connector for their own board; I imagine adding a few extra traces to link the cards would effectively cost nothing :)
I agree, and I also think these will be the only cards for this iteration of the nMP, but I don't see a long-term trade-off. Current-generation GPUs use the bus directly, so keeping exactly the same connector is only going to leave some unused pins?
 
That's exactly what I meant. Clearly when not in use, the GPUs will be fine, CF or not. CF gaming is going to essentially double the heat, and you still have the CPU.



Absolutely; I highly doubt the computer will shut off. However, I would really like to know what clock the GPUs are running at during CF gaming after the machine warms up. Apple says the "thermal core" is the greatest thing ever and is just as good as three individual fans and heat sinks... I just want proof, is all.

That's why I'd like to see some "hot" benchmarks. That's the way NCIX benchmarks, and I think it's the most applicable approach, especially in this case.

Would you really be surprised if the nMP used a lower clock when using both GPUs and the CPU?

----------



I would also like to know how the drivers behave using professional apps in Windows. Not to sound like a broken record, but again: I'd like hot benchmarks, please.

I believe the drivers for the D700 in Windows are fine, probably nearly identical to either the FirePro or the 7970 drivers (because they are nearly identical GPUs). We may be seeing the maximum performance these cards can do.

I think the discrepancy between the Geekbench and BF4 performance was not a driver problem, but a down-clock due to heat.

If we consider the gaming scenario only, I do not know of any games that require massive CPU power, so it is possible there will be headroom for the GPUs in CrossFire. All speculation, of course.
 
It worked even if you did not hook up the CrossFire sockets on the two cards?
If so, it's not hardware dependent. If it required the connection, then it is.

I missed that the topic was about the nMP's AMD cards specifically; I thought it was just about the nMP.

There is nothing to stop the nMP from using CrossFire in Windows, as CrossFire is not hardware dependent. By hardware dependent, I mean that the motherboard needs to support it. But you are correct that the proprietary AMD cards themselves could be missing CrossFire support, in terms of firmware.

However, present and future cards from AMD will not require a physical CrossFire bridge between the cards. See here.
 