http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/3
Yes, they have it wrong: they are showing the maximum power available through the Molex and PCIe connectors, calculated from spec, not the card's actual draw.
I will say this again. AMD is able to get a FULL Fiji core down to a 175 W max TDP with typical power draw around 157 W, which I think you, MVC, completely forgot. Or ignored, just because you can. By the numbers we know, it looks like the Fury Nano will run at 950-975 MHz. The Fury X, per the thread linked above, goes from a 375 W TDP (290 W nominal draw) down to a 225 W max draw with only a 15 MHz lower core clock.
P.S. A dual-Fiji setup running at 850 MHz will deliver 14 TFLOPS, more than twice a Titan X's throughput in the same power envelope.
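For anyone who wants to check that 14 TFLOPS figure, it falls out of the usual peak-FP32 formula. A quick sketch, assuming (this is not stated in the post) a full Fiji core has 4096 stream processors doing 2 FLOPs per clock via FMA:

```python
# Back-of-envelope check of the dual-Fiji FP32 figure quoted above.
# Assumptions (not from the post): 4096 stream processors per Fiji,
# 2 FP32 ops per clock per processor (one fused multiply-add).
STREAM_PROCESSORS = 4096
OPS_PER_CLOCK = 2        # FMA counts as 2 FLOPs
CLOCK_HZ = 850e6         # 850 MHz, the clock quoted in the post

def fp32_tflops(num_gpus: int) -> float:
    """Peak single-precision throughput in TFLOPS."""
    return num_gpus * STREAM_PROCESSORS * OPS_PER_CLOCK * CLOCK_HZ / 1e12

print(f"{fp32_tflops(2):.1f} TFLOPS")  # dual Fiji at 850 MHz -> 13.9 TFLOPS
```

That is peak theoretical throughput, of course, not sustained performance.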
So now the international conspiracy to make Fury look like a power hog INCLUDES Asus, one of the manufacturers?
But you, a consumer sitting at home who has never laid hands on one, know better than the manufacturer?
That's not hubris at all, is it?
And I hate to break it to you, but 225 Watts is still 100 Watts too much PER CARD, or 200 Watts total beyond what the nMP PSU can supply.
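The overshoot arithmetic, spelled out. A minimal sketch, taking the 125 W per-GPU figure mentioned further down as the nMP's budget per card (an assumption about the machine, not something AMD publishes):

```python
# How far a 225 W card overshoots a 125 W per-GPU budget in a dual-GPU nMP.
CARD_TDP_W = 225        # Fury at its reduced TDP, per the post
GPU_BUDGET_W = 125      # assumed per-GPU budget inside the nMP
NUM_CARDS = 2

over_per_card = CARD_TDP_W - GPU_BUDGET_W
over_total = over_per_card * NUM_CARDS
print(over_per_card, "W per card,", over_total, "W total")  # 100 W per card, 200 W total
```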
So the Fat Lady is going to have to eat a few more salads before she can sing from inside the nMP.
And am I the only one to see the absurd folly of all this discussion?
We are debating how much they will have to strangle the Fury to make it fit in an expensive workstation. Meanwhile, the cMP can already take cards that beat the Fury X, with no artificial limits imposed by a lack of power infrastructure.
So, maybe, good golly gosh, if we are super duper lucky, someone will figure out a way to only mildly strangle the 3rd-fastest card on the market so that it can function on 125 Watts. And if all goes well, we'll only have to wait 6 more months to find out how much slower it is than the cards you can already put in a cMP. Isn't this insanity?
(Sing along with the Beach Boys)
Wouldn't it be nice if Fury used less power,
So it could run in the new Mac Pro.
And wouldn't it be nice if it wasn't so hot,
So it wouldn't fry the CPU.
Imagine if it were a real computer,
Then we wouldn't have to make excuses.
Oh wouldn't it be nice?