Yes, and the R9 380X is around 20% faster while using 15% more power.

There is only one set of benchmarks out there, and in that single benchmark (Battlefield) the R9 380X is about ten percent faster than GM204 but slower than GM200, if the benchmark is even real. 20% is a ridiculous claim.

Secondly, getting 3.5 Tflops from 129W of power is better power efficiency than Nvidia is capable of. The GCN architecture is the most power-efficient architecture in the world right now, but the mainstream does not admit it.

Ignoring the conspiracy theory at the end, in which the mainstream is somehow hiding the truth or wearing blindfolds, you will have to explain why some of the world's fastest supercomputers and rendering farms have chosen Nvidia's solutions over AMD's.

I gave AMD the credit they deserve, but then you go and take it to a whole other level.


Under any sustained workload the GTX 980 draws around 280W, or... it is really slow. The sustained power envelope of Maxwell cards is a problem.

This was a stupid remark to make; it flies in the face of everyone's experience and every thorough review, and I'm ignoring you now.
 
I'm just repeating the words of people from the industry.

In games, the TDP of Maxwell GPUs really is around 185W. In OpenCL it is higher, because delivering full performance in sustained workloads draws more power. That is the experience of using Maxwell GPUs for things other than gaming and benchmarking.

Again, compare the Tflops of AMD's GCN cards with the Tflops of Nvidia's cards. That is the mark of power efficiency, and it is why Apple went with AMD GPUs.
 
Well, I don't just repeat some exaggeration from someone. I prefer to cite sources, but even the single source we have is not confirmed genuine, as I mentioned earlier.

Here are the supposed benchmark tests. Like I said, AMD is performing faster but consuming more power, which makes the efficiency about equal. I for one don't need five extra frames per second if it adds a month's worth of electricity to my yearly bill.

The R9 390X is slightly ahead of the GM200.
The GM200s are slightly ahead of the R9 380.
The 195W R9 380 is slightly ahead of the 165W GTX 980.

[Image: leaked benchmark chart - Nvidia GM200/Titan vs. AMD Fiji and Bermuda]


Performance relative to power consumption is measured by comparing the yellow bar to the black bar; the greater the difference, the worse the efficiency. As you can see, power efficiency is very similar across the upcoming generation. Previously, the 780 Ti and AMD's cards had dreadful power consumption, while the GTX 970 and 980 had amazing efficiency, the best on the whole chart.

The good news is that next-gen cards can be installed in a cMP without external power sources. The Nvidia cards will work immediately, even with current web drivers. I don't know if that is true for the AMD cards.

[Image: power consumption chart - Nvidia GM200/Titan vs. AMD Fiji]
 
Nvidia GTX 980 - 4.6 Tflops from 185W TDP.
Radeon R9 380X - 7-8 Tflops from 200W TDP.

Which architecture is more power efficient?

Gaming power is not power efficiency.

The GCN architecture is the most power-efficient GPU architecture on the planet. Period.
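
Put the quoted specs side by side and the whole argument is Tflops per watt. A quick sketch in Python (taking 7.5 Tflops as the midpoint of the claimed 7-8 range; all of the 380X figures are unconfirmed leaks):

[CODE]
# Rough theoretical efficiency from the quoted (unconfirmed) spec numbers.
# The 7.5 Tflops figure is an assumed midpoint of the claimed 7-8 range.
cards = {
    "GTX 980 (as quoted)": (4.6, 185),  # (Tflops, TDP in watts)
    "R9 380X (rumored)": (7.5, 200),
}

for name, (tflops, watts) in cards.items():
    gflops_per_watt = tflops * 1000 / watts
    print(f"{name}: {gflops_per_watt:.1f} Gflops/W")
[/CODE]

Around 37.5 Gflops/W against roughly 24.9 on those numbers, though the result is only as good as the leaked figures it starts from.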

Edit: One more thing...

http://semiaccurate.com/forums/showthread.php?t=8364
In the first post are benchmarks for the R9 380X. As we can see there, the R9 is 20% faster and uses 5% more power.

Try arguing more. Those benchmarks come from the same source as yours: ChipHell.com.
 

Period what? If it's that time of the month, I can understand your anger and confusion. Let's break down your latest codswallop.

- Radeon R9 380X: 7-8 Tflops from 200W.
Make up your mind; there's a big difference between 7 and 8 Tflops. Nevertheless, it's an unconfirmed number.

- As the power consumption chart I posted shows, the 380X has a power consumption of around 225W under heavy load, and the GTX 980 consumes about 185W under the same load. You got the numbers completely wrong.

- The GTX 980 is confirmed at 5 Tflops; you cut it down to 4.6 for fun. That GPU was Nvidia's response to the R9 290X, so why compare it against a card AMD hasn't released yet? You should compare GM204 cards to the R9 200X series and GM200 cards to the R9 300X series. You are mixing generations.

Here is the chart for cards that do currently exist. Again, the GTX 980's specs are 165W with 5 Tflops; the R9 290X is 290W for 5.6 Tflops. That's ****ing terrible efficiency from AMD.

http://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed/3
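
Run the same Tflops-per-watt arithmetic on those shipping cards, using nothing but the spec numbers quoted above (a rough sketch from the quoted specs, not a measured result):

[CODE]
# Perf-per-watt for cards that actually exist, from the specs quoted above.
gtx_980 = 5.0 * 1000 / 165   # ~30.3 Gflops/W
r9_290x = 5.6 * 1000 / 290   # ~19.3 Gflops/W
print(f"GTX 980: {gtx_980:.1f} Gflops/W")
print(f"R9 290X: {r9_290x:.1f} Gflops/W")
[/CODE]

Roughly 30 Gflops/W against 19. That is the gap I'm talking about.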

I don't care for fanboyisms and broisms. Gimme facts or go home.
 
Dual CPU, single faster GPU. 256GB RAM, and 2 PCI-e slots for SSD. Also do it in blue so it looks like one of the old SGI workstations. ;o)
 
[QUOTE]
Make up your mind; there's a big difference between 7 and 8 Tflops. Nevertheless, it's an unconfirmed number. ... You should compare GM204 cards to the R9 200X series and GM200 cards to the R9 300X series. You are mixing generations. ... Gimme facts or go home.
[/QUOTE]

Tflops = (number of GCN cores × 2 × core clock in GHz) / 1000.
That's the equation for the GCN architecture to compute a card's theoretical performance.
So, with an 850 MHz clock on a 4096-core GPU, you get 7 Tflops; at 1 GHz the GPU gets 8.1 Tflops. Going by the raw numbers it looks more like 850 MHz, or... the drivers for that card are not yet refined, which is why it performs worse than it should. Yes, it is performing worse than it should.
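
Here is that formula as a quick Python check (theoretical peak only, not measured performance):

[CODE]
def gcn_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical peak of a GCN GPU: 2 FLOPs (one FMA) per core per clock."""
    return cores * 2 * clock_ghz / 1000.0

# The two cases above: a 4096-core GPU at 850 MHz and at 1 GHz.
print(gcn_tflops(4096, 0.85))  # 6.9632 -> about 7 Tflops
print(gcn_tflops(4096, 1.00))  # 8.192  -> about 8.1 Tflops
[/CODE]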

I don't compare the GTX to cards that are 3 years old. I'm comparing them to the ones that are coming, because those are what count.

P.S. Yes, I am a fanboy. An Nvidia fanboy. Every computer I have had came with an Nvidia GPU. Yet I can still see beyond that.
 
[QUOTE]
Dual CPU, single faster GPU. 256GB RAM, and 2 PCI-e slots for SSD. Also do it in blue so it looks like one of the old SGI workstations. ;o)
[/QUOTE]

Yep, the O2 was gorgeous, and it was modular with pull-out PCI slots. I saw it in person running Softimage when it was released at SIGGRAPH. But it didn't sell, because Windows NT workstations cut SGI down.

[Photo: SGI O2 workstation]

[Photo: SGI O2 opened up]
 

I thought they were very cool at the time, but well beyond my reach and, in all honesty, rather niche in their application support - i.e. it would have looked good, but I wouldn't have had a genuine use for it.

Having said that, Apple could learn a thing or two from the design. Small shouldn't mean a lack of internal expansion.
 