First off, most of the power cutting of the 7990 is through binning (the chips themselves, according to that article). Second, further power savings by consolidating the circuitry of 2 cards into 1 may or may not be significant, but even if it is: isn't imagining a master/slave card being linked over the PCIe 3.0 port of the nMP a bit of a stretch?

All we've figured out so far: if Apple is just using scaled-down W9000s, the wattage requirement relative to the 450W spec is astronomical. Through binning (as the 7990 reference design does), they can shave off quite a lot. Through consolidation of circuitry, they can perhaps shave a little more. Through quick-idle and smart distribution of resources, perhaps more can be shaved (though I'm skeptical as to which tasks, and by how much).

Even if all these were rolled into one product (currently conjecture), we'd still be at a pretty significant power deficit.

I think all you guys are missing my point. You cannot pull 275 watts out of 365mm^2 of silicon without high flow chilled coolant and maybe welded heat exchangers.

Sorry, but the idea that some die bin out at 275 watts and others at 150 watts is ridiculous. If some die can run at 20% less voltage for a given speed, dynamic power (which scales roughly with the square of voltage) drops by about a third, but to believe in die-to-die variation big enough to nearly halve power is crazy.
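
Back-of-envelope version of that claim (a sketch: it assumes dynamic power scales roughly with voltage squared at a fixed clock, the usual CMOS rule of thumb with leakage ignored, and uses this thread's 275W/150W figures):

```python
# What die-to-die voltage delta would cutting a Tahiti-class GPU
# from 275 W to ~150 W at the same clock actually require, if
# dynamic power ~ V^2? (Rule of thumb; leakage ignored.)
p_hot, p_cool = 275.0, 150.0        # watts, per-GPU figures from this thread

v_ratio = (p_cool / p_hot) ** 0.5   # P ~ V^2  =>  V ~ sqrt(P)
print(f"voltage reduction needed: {1 - v_ratio:.0%}")
# -> ~26% lower voltage at the same clock, far beyond typical
#    die-to-die binning margins, which is the point above
```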

I don't claim to know what the current budgets are for each chip in the new Mac Pro, but it's premature to assume throttling. There's a lot of unicorn hair and pixie dust in this thread.
 
First off, most of the power cutting of the 7990 is through binning (the chips themselves, according to that article). Second, further power savings by consolidating the circuitry of 2 cards into 1 may or may not be significant, but even if it is: isn't imagining a master/slave card being linked over the PCIe 3.0 port of the nMP a bit of a stretch?

All we've figured out so far: if Apple is just using scaled-down W9000s, the wattage requirement relative to the 450W spec is astronomical. Through binning (as the 7990 reference design does), they can shave off quite a lot. Through consolidation of circuitry, they can perhaps shave a little more. Through quick-idle and smart distribution of resources, perhaps more can be shaved (though I'm skeptical as to which tasks, and by how much).

Even if all these were rolled into one product (currently conjecture), we'd still be at a pretty significant power deficit.

Good article.

Here is the link if anyone is interested.

http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official/16

They ultimately come to the conclusion that the 7990 has no performance advantage over the GTX 690 while using an extra 75 watts. That makes Apple's decision to use it more questionable: if the goal was quiet, low-power operation, it sounds like a pair of 680s would have delivered the same GPU horsepower with 75 watts less going in and 75 watts less heat to dissipate.

They also complain that the card took 17 months to come out after the 7970, so it was considered "dated" back in April. The more things change, the more they stay the same.

And even if the number is 375 Watts, 375 Watts + 130 Watts = 505 Watts, already over the quoted 450 Watts WITHOUT any of the other system pieces added in.

So the original premise of the thread stands. Either throttling is going to happen, or the upgraded GPUs require a different PSU, meaning upgrades move into the "nearly impossible" range. (ie, might as well just buy a new one)
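
To spell out the budget math (a sketch; the component figures are the numbers quoted in this thread, not confirmed Apple specs):

```python
# Naive worst-case power budget for a top nMP config, using the
# figures quoted in this thread (not confirmed Apple specs).
PSU_RATING = 450  # watts, from Apple's published spec sheet

components = {
    "dual D700 GPUs (7990-class)": 375,  # full-load estimate from the thread
    "12-core Xeon E5 (TDP)": 130,
    # SSD, fans, TB controllers, bus-powered devices, etc. left out
}

total = sum(components.values())
print(f"GPU + CPU alone: {total} W (over the PSU by {total - PSU_RATING} W)")
# -> 505 W, already 55 W over the 450 W spec before anything else is added
```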

As I said before, Apple's D700 duo produces 7 TFlops vs. the 7990's 8.2, which means they are down-clocking it, which in turn should bring it into the range of 325W. Furthermore, this late in the product cycle (as many of you are fond of pointing out), it's extremely likely that yields and binning have improved since the 7990 first launched.
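
Roughly quantified (a sketch: it assumes TFlops scale linearly with core clock, and power scales somewhere between linearly and cubically with clock, the usual DVFS bounds):

```python
# Estimate dual-D700 board power from the advertised TFlops by scaling
# down the 7990's published figures. Assumes TFlops ~ clock, and power
# between f^1 (no voltage headroom) and f^3 (voltage tracks clock).
tflops_7990, watts_7990 = 8.2, 375.0   # stock HD 7990
tflops_d700_duo = 7.0                  # Apple's advertised dual-D700 figure

clock_ratio = tflops_d700_duo / tflops_7990   # ~0.85
low = watts_7990 * clock_ratio ** 3           # aggressive voltage scaling
high = watts_7990 * clock_ratio               # clock-only scaling
print(f"~{clock_ratio:.0%} clock -> {low:.0f} to {high:.0f} W")
# -> roughly 233 to 320 W, which is where the ~325 W ballpark comes from
```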

Of course, with a 130W TDP CPU and GPUs in the 300-325W range, that's still pushing the limits of a 450W power supply. However, while summing the max power of every component at full load is one way of determining your power budget, it can be unnecessarily conservative. I wouldn't be surprised if Apple ran their top configuration through a variety of stress tests to determine what PSU capacity was needed, and 450W is what was deemed necessary.

Also, MVC... unless I'm mistaken, the GTX 690 is 5.6 TFlops at 300W, which is about 18.7 GFlops/W, compared to the 7990's 8.2 TFlops at 375W, which is about 21.9 GFlops/W. So a pair of Tahiti XT cores does have a performance-per-watt advantage over a pair of GK104s in GPGPU tasks. And, last but not least, even Apple's down-clocked D700s will outperform a GTX 690 in the TFlop arena.
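
Here's that perf-per-watt math in one place (the D700 duo's wattage is the down-clock estimate from above, not a published number):

```python
# Perf/W check for the cards discussed above. TFlops and watts are the
# figures quoted in this thread; the D700 duo's 325 W is an estimate.
cards = {
    "GTX 690 (2x GK104)": (5.6, 300),
    "HD 7990 (2x Tahiti)": (8.2, 375),
    "D700 duo (down-clocked, est.)": (7.0, 325),
}
for name, (tflops, watts) in cards.items():
    print(f"{name}: {tflops * 1000 / watts:.1f} GFlops/W")
# -> ~18.7, ~21.9, ~21.5: Tahiti keeps its perf/W edge even down-clocked
```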
 
Starting to struggle with these threads!

As I said before, Apple's D700 duo produces 7 TFlops vs. the 7990's 8.2, which means they are down-clocking it, which in turn should bring it into the range of 325W. Furthermore, this late in the product cycle (as many of you are fond of pointing out), it's extremely likely that yields and binning have improved since the 7990 first launched.

Of course, with a 130W TDP CPU and GPUs in the 300-325W range, that's still pushing the limits of a 450W power supply. However, while summing the max power of every component at full load is one way of determining your power budget, it can be unnecessarily conservative. I wouldn't be surprised if Apple ran their top configuration through a variety of stress tests to determine what PSU capacity was needed, and 450W is what was deemed necessary.

Also, MVC... unless I'm mistaken, the GTX 690 is 5.6 TFlops at 300W, which is about 18.7 GFlops/W, compared to the 7990's 8.2 TFlops at 375W, which is about 21.9 GFlops/W. So a pair of Tahiti XT cores does have a performance-per-watt advantage over a pair of GK104s in GPGPU tasks. And, last but not least, even Apple's down-clocked D700s will outperform a GTX 690 in the TFlop arena.

So, if I read this right, MVC and others are saying the current nMP spec sheets mean it can't work, or will not work as advertised! Physics says so!

I can see his argument, though it's 30+ yrs since I've done much physics so I bow deferentially to his knowledge.

This is where I struggle with his and others' arguments though.

We have seen Mari and Pixar do some serious 8K-texture 3D heavy lifting live at WWDC: GPU-intensive operations that would normally need a separate 12-hour render being manipulated in real time on stage. Mari is out (or just about to be) on OS X as well, and I'm sure they're not targeting the home hobbyist!

There's gonna be some serious egg on faces, and probably lawsuits, if that demonstration was fabricated or less than honest, once Mari and the nMP are used in the real world.

I'm aware of the Apple hype at WWDC, but we saw the Mari user say he's never seen it run smoother! And remember that Mari was used in 9 out of 10 of the shortlisted visual-effects films at the 2013 Oscars. This is a serious package for very serious work.

Some big reps are on the line, and we have naysayers on here saying that Apple can only have ******** up! Really?!
 
Some big reps are on the line, and we have naysayers on here saying that Apple can only have ******** up! Really?!

No, from the very first post the possibility was stated that the more powerful models might simply have more powerful power supplies.
 
So, if I read this right, MVC and others are saying the current nMP spec sheets mean it can't work, or will not work as advertised! Physics says so!

I can see his argument, though it's 30+ yrs since I've done much physics so I bow deferentially to his knowledge.

This is where I struggle with his and others' arguments though.

We have seen Mari and Pixar do some serious 8K-texture 3D heavy lifting live at WWDC: GPU-intensive operations that would normally need a separate 12-hour render being manipulated in real time on stage. Mari is out (or just about to be) on OS X as well, and I'm sure they're not targeting the home hobbyist!

There's gonna be some serious egg on faces, and probably lawsuits, if that demonstration was fabricated or less than honest, once Mari and the nMP are used in the real world.

I'm aware of the Apple hype at WWDC, but we saw the Mari user say he's never seen it run smoother! And remember that Mari was used in 9 out of 10 of the shortlisted visual-effects films at the 2013 Oscars. This is a serious package for very serious work.

Some big reps are on the line, and we have naysayers on here saying that Apple can only have ******** up! Really?!

Don't bring your logical thought to these "the sky is falling on our heads" threads. ;)
 
No, from the very first post the possibility was stated that the more powerful models might simply have more powerful power supplies.

Yup, Occam's razor at its best. Binning, throttling, unknown chips, thermal envelopes, and whatever else we can dream up is unnecessarily complicated when they can just fit a beefier power supply to the configurations they have yet to release specs for. Apple released specs for two systems, the quad-core Xeon with dual D300s and the hexacore Xeon with dual D500s, and those specs listed 450W. There is a reason they didn't bother releasing pricing and specs for any other models: they didn't want to sticker-shock people, and/or they didn't want to confuse things with two separate sets of power and acoustic specs by putting asterisks all over the sheet noting that the lower numbers only apply to the 450W models. We will undoubtedly know more as the shipping date approaches, but I lean towards the simpler explanations myself.
 
[G5]Hydra said:
Yup, Occam's razor at its best. Binning, throttling, unknown chips, thermal envelopes, and whatever else we can dream up is unnecessarily complicated when they can just fit a beefier power supply to the configurations they have yet to release specs for.

An even simpler explanation is that the specs are accurate, and Apple's just down-clocking, just as they have for many of their products in the past :)

325 watts to run the GPUs at full bore is easily possible in the nMP with 450W, just as long as it's a GPU-only task and you have no bus-powered TB or USB devices plugged in. The CPU's idle power is extremely low.

It'd be hilarious if your mobile hard drive cost you 10 FPS when you leave it connected...
 
An even simpler explanation is that the specs are accurate, and Apple's just down-clocking, just as they have for many of their products in the past :)

375 watts to run the GPUs at full bore is easily possible in the nMP with 450W, just as long as it's a GPU-only task and you have no bus-powered TB or USB devices plugged in. The CPU's idle power is extremely low.

It'd be hilarious if your mobile hard drive cost you 10 FPS when you leave it connected...

No, downclocking/binning enough to get a W9000 anywhere near the needed 120W per GPU wouldn't get you close to 3.5 TFLOPS. Apple has downclocked GPUs that went into MPs in the past, but we are talking 10% here, not 50%. The people and organizations targeted by the new MP, especially the 12-core dual-D700 configs, buy these things for what they can do. Marketing doesn't apply to them; they look at it as a simple equation of how many $$$/hr of work they can generate. A machine that can't do what they need when run flat out is worse than useless, because it costs them money. Besides, some well-connected people in various industries are already sampling these things; if Apple had screwed the pooch we'd be hearing horror stories already...
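
To put numbers on that (a sketch; the W9000 baseline of ~4.0 TFLOPS at 274W is AMD's published spec as I remember it, so double-check before leaning on it):

```python
# How many TFlops could a W9000-class (Tahiti) part deliver on a 120 W
# budget? Two bounding assumptions: power ~ f^3 (voltage tracks clock,
# best case) and power ~ f (voltage already at its floor, worst case).
base_tflops, base_watts, budget = 3.99, 274.0, 120.0  # W9000 spec, from memory

f_best = (budget / base_watts) ** (1 / 3)
f_worst = budget / base_watts
print(f"best case:  {base_tflops * f_best:.2f} TFlops")
print(f"worst case: {base_tflops * f_worst:.2f} TFlops")
# -> ~3.0 TFlops at best, ~1.75 at worst: either way short of the
#    3.5 TFlops Apple advertises per D700, which is the point above
```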
 
[G5]Hydra said:
No, downclocking/binning enough to get a W9000 anywhere near the needed 120W per GPU wouldn't get you close to 3.5 TFLOPS. Apple has downclocked GPUs that went into MPs in the past, but we are talking 10% here, not 50%.


Read this:
As I said before, Apple's D700 duo produces 7 TFlops vs. the 7990's 8.2, which means they are down-clocking it, which in turn should bring it into the range of 325W. Furthermore, this late in the product cycle (as many of you are fond of pointing out), it's extremely likely that yields and binning have improved since the 7990 first launched.

450 - 325 = 125W left over for the CPU and other components. That's plenty to deliver the 7 TFlops of GPU power Apple advertises, as long as the CPU remains mostly idle (as it does in many GPU benchmarks).
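
Or as a sketch (the CPU idle figure is a rough guess for a mostly-idle Xeon package, not a measured number):

```python
# What the 450 W budget looks like during a GPU-only workload,
# using the thread's numbers. The CPU idle draw is a rough guess.
psu, gpus_full_bore, cpu_idle = 450, 325, 15  # watts

leftover = psu - gpus_full_bore - cpu_idle
print(f"{leftover} W left for SSD, fans, and bus-powered devices")
# -> ~110 W of headroom with the CPU idle; effectively none if the
#    CPU runs at its 130 W TDP at the same time
```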


[G5]Hydra said:
The people and organizations targeted by the new MP, especially the 12-core dual-D700 configs, buy these things for what they can do. Marketing doesn't apply to them; they look at it as a simple equation of how many $$$/hr of work they can generate. A machine that can't do what they need when run flat out is worse than useless, because it costs them money. Besides, some well-connected people in various industries are already sampling these things; if Apple had screwed the pooch we'd be hearing horror stories already...

That's quite an elaborate string of logic, but you're basically saying that Apple can't be throttling a product that hasn't been released yet and has no independent public benchmarks, because someone would have called them out on it?

As far as "some well connected people" having nMP already (I cry baseless conjecture), you know that those people are probably 1,000% sure to have signed a NDA, right?

How many unauthorized benchmarks do we see of any technology before it's released? How many were Apple products?
 
No, from the very first post the possibility was stated that the more powerful models might simply have more powerful power supplies.

And if that's so, then what? What's the consequence? Are we saying that the dB ratings quoted can't be held? I would be surprised if Apple quoted them as best case only; they would have quoted mean values over the 4 systems.

By my reading, the implication in some of the posts is that it all can't be done even for the quoted systems. All I was saying was that it can be done; we've seen it, and we've seen it heavy lifting. From what I remember it didn't sound like a jet engine either.

As to upgradability, I'd be surprised if it's any more than RAM and SSD at best; GPUs, doubtful. But you buy the system for your needs, current and expected, and if it doesn't match up, don't buy it. At least wait and see what the full specs across the range are before you slam it... it's the iPad and iPhone slamming all over again... and then they launch and we know the rest!
 
An even simpler explanation is that the specs are accurate, and Apple's just down-clocking, just as they have for many of their products in the past :)

That's a simpler explanation only, not a more likely one. We'll have to wait and see.

----------

Are we saying that the dB ratings quoted can't be held? I would be surprised if Apple quoted them as best case only; they would have quoted mean values over the 4 systems.

I'd say that before WWDC, we wouldn't have believed that even a 6-core Xeon system with dual GPUs could run at 12 dB.
 
At least wait and see what the full specs across the range are before you slam it... it's the iPad and iPhone slamming all over again... and then they launch and we know the rest!

It's a rumors site, so speculation is normal.

Otherwise it'd be MacVerifiedFacts.com!
 
(I cry baseless conjecture)

I forgot, is there anything anyone posts here that isn't baseless conjecture? You are guesstimating the wattage of unknown GPUs and how they will be throttled back to the point of near uselessness. Why bother with two expensive GPUs if you need to cripple them, no? Just run a single one and be done with it. It's all baseless conjecture. Answer this: if Apple is pulling the wool over everyone's eyes, even pros who know about these things, why did they bother to spec the D300 at 2.0 TFLOPS and the D500 at 2.2 TFLOPS? If those numbers are all throttled anyway, and not achievable with the CPU running more than idle, they might as well have split the difference and claimed 2.7 or more TFLOPS.
 
[G5]Hydra said:
I forgot, is there anything anyone posts here that isn't baseless conjecture?

Fair enough, you just presented it as fact was all :)

[G5]Hydra said:
You are guesstimating the wattage of unknown GPUs and how they will be throttled back to the point of near uselessness. Why bother with two expensive GPUs if you need to cripple them, no?

Small form factor at 18 dBA? Some people would make that sacrifice. Besides, I wonder how a power deficit would affect the performance of a lot of apps, since it seems the CPU or the GPU can run full bore just fine, as long as it's not both at the same time.
 
Fair enough, you just presented it as fact was all :)



Small form factor at 18 dBA? Some people would make that sacrifice. Besides, I wonder how a power deficit would affect the performance of a lot of apps, since it seems the CPU or the GPU can run full bore just fine, as long as it's not both at the same time.

Surely there must be some apps that tax the GPU and the CPU to the fullest.
 
Surely there must be some apps that tax the GPU and the CPU to the fullest.

I'm sure there are a lot of them (games, to some extent, among others); otherwise this thread would be irrelevant whether the conjecture is true or not. I was just saying it's not a foregone conclusion that Apple didn't make that sacrifice for you.
 
Fair enough, you just presented it as fact was all :)



Small form factor at 18 dBA? Some people would make that sacrifice. Besides, I wonder how a power deficit would affect the performance of a lot of apps, since it seems the CPU or the GPU can run full bore just fine, as long as it's not both at the same time.

Look, I'm not arguing that what you post isn't possible. Sure it is, but it doesn't make sense. Why bother even specifying more than a quad Xeon then? It's all about getting enough power for the GPUs, and the CPU is a vestige at that point. Why spend nearly $3,000 on a single piece of silicon if you can't use it while the GPUs are humming? The problem with general-purpose computing is, well, er... it's general purpose. Sure, the MP is a targeted workstation, but what happens if someone develops a killer app that needs both CPU and GPU? Apple puts up an embarrassing disclaimer that says "* performance of GPU and CPU cannot be guaranteed when run concurrently"? It would just be such a mess, IMHO.
 
[G5]Hydra said:
Look, I'm not arguing that what you post isn't possible. Sure it is, but it doesn't make sense. Why bother even specifying more than a quad Xeon then? It's all about getting enough power for the GPUs, and the CPU is a vestige at that point. Why spend nearly $3,000 on a single piece of silicon if you can't use it while the GPUs are humming? The problem with general-purpose computing is, well, er... it's general purpose. Sure, the MP is a targeted workstation, but what happens if someone develops a killer app that needs both CPU and GPU? Apple puts up an embarrassing disclaimer that says "* performance of GPU and CPU cannot be guaranteed when run concurrently"? It would just be such a mess, IMHO.

I totally agree. The only part I don't agree with is the conclusion: the fact that it would be a mess doesn't make it impossible that Apple did it anyway. Running 500+ watts through the iTube may require more RPM from the fan than Apple is ready to live with.

In the end, we'll just see. I won't be surprised either way.
 
I totally agree. The only part I don't agree with is the conclusion: the fact that it would be a mess doesn't make it impossible that Apple did it anyway. Running 500+ watts through the iTube may require more RPM from the fan than Apple is ready to live with.

In the end, we'll just see. I won't be surprised either way.

Nothing is impossible obviously. What we are talking about here are probabilities of different scenarios. I'd be surprised if CPU or GPU are being throttled, and I wouldn't be surprised if the higher end models have more powerful PSU's and are louder.
 
Nothing is impossible obviously. What we are talking about here are probabilities of different scenarios. I'd be surprised if CPU or GPU are being throttled, and I wouldn't be surprised if the higher end models have more powerful PSU's and are louder.

Definitely possible! Of course, that leads to MVC's point: If there are 2 models with 2 proprietary PSUs and likely 2 proprietary motherboards, then you will absolutely not be able to upgrade the video cards of the lower models to anything that uses more power.
 
Nothing is impossible obviously. What we are talking about here are probabilities of different scenarios. I'd be surprised if CPU or GPU are being throttled, and I wouldn't be surprised if the higher end models have more powerful PSU's and are louder.

OK, I'll drop shoe #1 again:

" It works by conducting heat away from the CPU and GPUs and distributing that heat uniformly across the core. That way, if one processor isn’t working as hard as the others, the extra thermal capacity can be shared efficiently among them."

What is shoe #2?

What happens when there isn't "extra thermal capacity" left?
 
Definitely possible! Of course, that leads to MVC's point: If there are 2 models with 2 proprietary PSUs and likely 2 proprietary motherboards, then you will absolutely not be able to upgrade the video cards of the lower models to anything that uses more power.

Umm... you left out the most important "proprietary" component of all in this scenario: the GPU cards themselves. Nobody is going to make these cards, and if they do, they are going to be so expensive that it will be cheaper to just sell your old new MP and buy a new one. Apple intends you to swap out the GPUs in the new MP about as much as they intend you to swap out the GPU in a MBP or iMac. The iMac a few years ago had a GPU on a little daughter card too, but I don't recall seeing tons of options for that either. When you want new GPUs, Apple has a solution: buy a new MP ;) It sucks, but that's the way it is on OS X and Macs now.
 
[G5]Hydra said:
Umm... you left out the most important "proprietary" component of all in this scenario: the GPU cards themselves. Nobody is going to make these cards, and if they do, they are going to be so expensive that it will be cheaper to just sell your old new MP and buy a new one. Apple intends you to swap out the GPUs in the new MP about as much as they intend you to swap out the GPU in a MBP or iMac. The iMac a few years ago had a GPU on a little daughter card too, but I don't recall seeing tons of options for that either. When you want new GPUs, Apple has a solution: buy a new MP ;) It sucks, but that's the way it is on OS X and Macs now.

Agreed. However, many think that they will swap out the cards with ones Apple makes for future Mac Pros (there are threads about this, and a story on the front page here).
 
Agreed. However, many think that they will swap out the cards with ones Apple makes for future Mac Pros (there are threads about this, and a story on the front page here).

Very true, but have you ever perused Apple OEM replacement-parts lists? The prices will make you dizzy. Also, if Apple goes the two-PSU route, they will likely always have a couple of lower-power cards and a higher-power card, so there will always be options. The higher-powered cards will likely cost you a couple grand minimum to get hold of. At that point, doesn't it just make sense to sell your MP for decent money and buy a new one, instead of plunking another few grand into a machine that is already old? Also keep in mind that these cards and connectors are all proprietary, as you mentioned before. What is to stop Apple from changing them all around every revision? PCIe cards had to be held to standards, but Apple's own internal workings are theirs to play with as they please. They make no assurances that any card they build in the future will even fit physically into an older machine.
 