He means the GTX 1080 Ti.

You used GPUs that will not be bottlenecked by the CPUs in 4K. A GTX 1080 Ti will be bottlenecked by the CPUs; it has much more power. A GTX 1070 and a GTX 980 Ti will not be bottlenecked by CPUs in 4K gaming. A GTX 1080 will be. Test it for yourself.

Just found this article, sorry for the bad Google English translation: https://translate.google.com/translate?hl=en&sl=de&tl=en&u=http://www.gamestar.de/hardware/praxis/grafikkarten/3311845/gpu_testsystem_2107_p2.html&sandbox=1

Example: 'The Witcher 3' (a very GPU-demanding game):
They state that going from an i7-4770K to an i7-7700K with a GTX 1080 Ti increased the fps by 16% in 1080p! In WQHD (1440p) the performance increase was only 3%. What do you think the difference will be in 4K? Negligible?

Scaled down to a W3690/X5690, a reasonable estimate would be: an i7-7700K PC gets ~32% more fps in 1080p, ~6% more in 1440p, and ~3% more in 4K. Negligible.
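As a rough back-of-the-envelope check of that estimate, here is a small Python sketch. The compounding of the measured gap is an assumption (treating the W3690/X5690 as roughly one more equal-sized CPU "step" behind the i7-4770K), not something the GameStar article measured:

measured_gain_7700k_vs_4770k = {"1080p": 0.16, "1440p": 0.03}  # GameStar, The Witcher 3, GTX 1080 Ti

# Assumption: the W3690/X5690 trails the i7-4770K by about as much as the
# 4770K trails the 7700K, so the measured gap compounds roughly twice.
estimated_gain_7700k_vs_w3690 = {
    res: (1 + gain) ** 2 - 1
    for res, gain in measured_gain_7700k_vs_4770k.items()
}

for res, gain in estimated_gain_7700k_vs_w3690.items():
    print(f"{res}: ~{gain:.0%} more fps for the i7-7700K")
# 1080p: ~35% more fps, 1440p: ~6% -- in the same ballpark as the ~32% / ~6% above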
 
I was going to make the point that you don't see CPU bottlenecking at resolutions that high. You typically see it at lower resolutions, unless the game is REALLY CPU dependent. That's the reason gamers are going crazy over the Ryzen benchmarks in 1080p gaming vs. Intel. At higher resolutions most of the work is rendered on the GPU, so the CPU isn't as much of an issue. I don't really understand it; I just know that I've seen some heavy gamers on YouTube run a 1080 with an FX-8350 with no problems at 1440p and above. At 1080p, though, there was major stuttering.
 
Can we get back on topic?

The RX 570s and 580s are out, and they are quite a letdown: huge spikes in power consumption for a marginal increase in performance. The only GPU that appears to be actually pretty good is the RX 560. Cheap, pretty efficient, and it does what it has to do (between GTX 1050 and 1050 Ti performance).
 

Yep, like the AMD R9 300 series generation, AMD is going to need a new chip to sit at the high end. Let's hope Vega shows up soon and can bring some excitement to the lineup.

I don't really understand AMD's strategy for doing these rebranding generations. Why get a bunch of coverage in the tech press if it's going to be "eh, AMD is selling old chips with a new name"? I would probably cut them a little slack if Vega were part of the refreshed lineup.
 

The saddest part for me is that these products will end up in Apple systems this year.
 
I am hoping that these rumors of a "pro" iMac and the modular Mac Pro point to Apple supporting higher-end GPUs such as the upcoming Vega. Who knows, maybe all this reassessment of their pro needs will convince Apple to use Nvidia again.
 
Yep, like the AMD R9 300 series generation, AMD is going to need a new chip to sit at the high end. Let's hope Vega shows up soon and can bring some excitement to the lineup.

I don't really understand AMD's strategy for doing these rebranding generations. Why get a bunch of coverage in the tech press if it's going to be "eh, AMD is selling old chips with a new name"? I would probably cut them a little slack if Vega were part of the refreshed lineup.
Actually, AMD is doing the same thing Nvidia did a few years back, in the days of the GTX 280 and GTX 260. Nvidia had to rebrand the GeForce 8800 GT(X/S) several times, at lower prices, to remain competitive against the HD 4850 and HD 4870. Then came the HD 5870 and HD 5850, and they competed with Nvidia's Fermi GPUs. We are talking about the lower-end market, so the situation is pretty similar: Nvidia rebranded one of the GeForce 8000 series GPUs into the 9800 series, later even into the GTS 250, and obviously into the OEM GT 100 series.

Polaris was supposed to be the lowest-development-cost architecture, and they achieved that, with everything that comes with it ;)
The saddest part for me is that these products will end up in Apple systems this year.
This kind of shows the perspective people have on these GPUs. It actually turns out that undervolting and underclocking Polaris GPUs, even slightly, saves a lot of power while maintaining a similar level of performance.

For example: the RX 480, with a 1,266 MHz core clock, draws 163 W under load.
The Radeon Pro WX 7100, the same Polaris 10 XT GPU with 2,304 GCN cores and a 1,243 MHz core clock:
[Chart: power draw across all scenes]

A 23 MHz difference in core clock, and 26 W of power saved.
The Sapphire Nitro+ RX 580 with a 1.45 GHz core clock:
[Chart: average power consumption]

It's outrageous...
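To illustrate why such a small downclock plus undervolt saves so much power, here is a minimal Python sketch using the usual dynamic-power rule of thumb, P ≈ C·f·V². The undervolt figure is an illustrative assumption, not a measured RX 480 or WX 7100 voltage:

def dynamic_power(base_power_w, f_ratio, v_ratio):
    # Dynamic power scales roughly with frequency and the square of voltage.
    return base_power_w * f_ratio * v_ratio ** 2

base_power = 163.0      # RX 480 board power under load (from the numbers above)
f_ratio = 1243 / 1266   # ~2% lower core clock (WX 7100 vs. RX 480)
v_ratio = 0.94          # assumed ~6% undervolt near the top of the V/f curve

print(f"estimated power: {dynamic_power(base_power, f_ratio, v_ratio):.0f} W")
# ~141 W, a saving in the same ballpark as the ~26 W quoted above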

Also keep in mind, guys, that the RX 5xx series is the exact reason Apple is right now "pointing fingers at AMD". Secondly, the GPUs we have seen so far are based on Polaris 20 XTX cores. Polaris 20 XTR chips are supposed to be golden samples, and while they may have outrageous power consumption at higher core clocks (the MSI Gaming X GPU based on XTR has TWO 8-pin connectors), they will also downclock and undervolt with better results than Polaris 10 XT and 20 XTX.
 
Eh, binning and undervolting to get better efficiency isn't exclusive to AMD. Given Nvidia's better efficiency across their desktop line, they probably have more headroom here.
 
That is correct, but what I meant is that it is not as bad as people are making it out to be, at least in Apple's case.
 

Huh? It's clear that NVIDIA has a more efficient architecture overall. In order for AMD to compete with NVIDIA on raw performance, they have to crank their GPUs up to much less efficient power levels. Apple has extremely tight power budgets for their systems. The logical choice would be for them to start with the most efficient architecture and then bin/adjust voltage from there. Instead, they insist on using the less efficient architecture and then lower the voltage/power to fit their systems, resulting in extremely poor competitive performance. That's why all of this annoys me; I think it's really silly that this year's Apple products will likely all be based on Polaris, given that it's a fundamentally less efficient architecture.
 
The architecture is not that much less efficient in compute...

Gaming is a different story, but I have learned that professionals on this forum value efficiency only in terms of gaming performance, not compute performance. Even if the R9 390X is on par in compute with the GTX 980 Ti while using a similar amount of power, it will never be considered equally efficient by this forum's experts (and those of other professional forums as well), because it is less efficient in gaming.

This is the benefit of using tile-based rasterization.

It is funny that professionals on this forum resist the simple fact that Apple cares about compute. It's even funnier that those people dare to call others out while not being able to see the broader point of view on AMD.
 
It seems this RX 5xx series is mostly a great way to get a hot GPU.

Especially since it's based on the RX 4(80), which I own and which sounds like a 747 taking off as soon as I enter World of Warcraft... I regret buying the thing: I could play at 2560x1080 at mid-level quality at around 90 fps with my silent GTX 745, yet I only manage two more notches up the quality scale at the same resolution and fps with the RX 480, all while it sounds like a hairdryer.
 
The architecture is not that much less efficient in compute...

Gaming is a different story, but I have learned that professionals on this forum value efficiency only in terms of gaming performance, not compute performance. Even if the R9 390X is on par in compute with the GTX 980 Ti while using a similar amount of power, it will never be considered equally efficient by this forum's experts (and those of other professional forums as well), because it is less efficient in gaming.

This is the benefit of using tile-based rasterization.

It is funny that professionals on this forum resist the simple fact that Apple cares about compute. It's even funnier that those people dare to call others out while not being able to see the broader point of view on AMD.

"Professional" can mean a lot of things. Some professionals need graphics/rendering performance. Also, its unfair to say that AMD is better at compute across the board. Some tasks favor AMD, some favor Nvidia.
 
"Professional" can mean a lot of things. Some professionals need graphics/rendering performance. Also, its unfair to say that AMD is better at compute across the board. Some tasks favor AMD, some favor Nvidia.

The software creators have pretty much decided who the winner is by choosing CUDA over OpenCL in far more pro software packages.
 
Especially since it's based on the RX 4(80), which I own and which sounds like a 747 taking off as soon as I enter World of Warcraft... I regret buying the thing: I could play at 2560x1080 at mid-level quality at around 90 fps with my silent GTX 745, yet I only manage two more notches up the quality scale at the same resolution and fps with the RX 480, all while it sounds like a hairdryer.
It's funny, because I have tested an RX 480 in a similar situation, and it was dead silent at 30% GPU load. What's more, in Overwatch at 1440p on medium settings, a game which strains the GPU much more than World of Warcraft, the RX 480 is dead silent at 50-60% GPU load!

You are either lying, or you had a faulty GPU.
"Professional" can mean a lot of things. Some professionals need graphics/rendering performance. Also, it's unfair to say that AMD is better at compute across the board. Some tasks favor AMD, some favor Nvidia.
Of course, but saying that AMD architectures are bad or not efficient is... an "oversimplification", or blatant lying. Or stupidity. Whatever you want to call it.
 

No lie, and I still have it. And it seems you are still triggered enough to call me names, which, as you should know given that you've memorized this forum's rules, is against said rules and could well see you put in a time-out. I would stop this trend if I were you.

The RX 480 being hot and noisy is a well-documented fact.

 
It's funny, because I have tested an RX 480 in a similar situation, and it was dead silent at 30% GPU load. What's more, in Overwatch at 1440p on medium settings, a game which strains the GPU much more than World of Warcraft, the RX 480 is dead silent at 50-60% GPU load!

You are either lying, or you had a faulty GPU.

Of course, but saying that AMD architectures are bad or not efficient is... an "oversimplification", or blatant lying. Or stupidity. Whatever you want to call it.

I never said the AMD architecture was bad. I simply said that the NVIDIA architecture is overall more efficient (emphasis on more, to be clear). Sure, there might be specific cases where an OpenCL kernel has been tuned to run really well on the AMD architecture, but the same can be said for NVIDIA as well. However, on average across a wide range of cases, NVIDIA gets better perf/watt. So, I continue to be disappointed that Apple only uses AMD GPUs in their power-limited products.
 
No lie, and I still have it. And it seems you are still triggered enough to call me names, which, as you should know given that you've memorized this forum's rules, is against said rules and could well see you put in a time-out. I would stop this trend if I were you.

The RX 480 being hot and noisy is a well-documented fact.
And do you have the reference model, or a non-reference one?

Because I had the MSI Gaming X model, the quietest RX 480 of them all and one of the most silent cooling designs overall.

The sound test examples start at 7:20.
 

PowerColor RX 480 Red Devil.

 
It turns out I was right. The Red Devil is one of the quietest RX 480 GPUs. You are either lying, or you have a faulty GPU.

https://www.computerbase.de/2016-07/powercolor-radeon-rx-480-red-devil-test/4/
Here is the test. Under gaming load the GPU averages 33 dB.

Again, I am not lying. And again with the insults.
I'm not alone:

https://www.reddit.com/r/buildapc/comments/5m8fbl/powercolor_red_devil_rx_480_runs_hot_and_loud/

It may be the best, but it's quite noisy in my system. And you are talking about an average of 33 dB, which means it could go from dead silent to a 747 (53 dB) and still average somewhere in the middle of the two (33 dB). It doesn't mean that it's at 33 dB MAX...
 
If you check the review, there is an OC mode for this GPU, and under gaming load it stays at 33 dB. You have a faulty GPU. It's even mentioned in the thread you linked.

P.S. The Red Devil will not get to 53 dB. Only a blower-style reference cooler can reach that level of noise. Aftermarket, non-reference coolers? I have not seen any GPU go over 50 dB with one.
 

33 dB is an average... Do you know what an average is? Do I have to explain it to you?
And the dB scale is a logarithmic one, not a linear one, which means a difference of 17 dB between your average and your max of 50 dB IS QUITE LOUD!
 
The GPU in the same review sits at 28 dB at the desktop. If you measure the average over a 20-minute timespan, spikes to 50 dB would have to be very short to average out at 33 dB.
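As a rough sanity check of that point, here is a small Python sketch. It assumes the review reports an energy-weighted (power) average of sound levels, which is how SPL is normally averaged; the 28/33/50 dB figures are the ones quoted in this exchange:

def db_to_power(db):
    # Convert a sound level in dB to a relative power value.
    return 10 ** (db / 10)

quiet, spike, average = 28.0, 50.0, 33.0   # desktop level, claimed spike, gaming average

# Solve x*P(spike) + (1 - x)*P(quiet) = P(average) for the spike time fraction x.
x = (db_to_power(average) - db_to_power(quiet)) / (db_to_power(spike) - db_to_power(quiet))

print(f"spike fraction: {x:.1%} of the time")          # ~1.4%
print(f"over 20 minutes: ~{x * 20 * 60:.0f} seconds")   # roughly 16 seconds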

Your theory is falling apart. You are too attached to your vision of the world. You have a faulty GPU, and you can blame only yourself for resisting reality. The Red Devil is not going to 50 dB, regardless of what you say. Max RPM for the fans on the Red Devil is 2,300; at 1,600 RPM they are at 33 dB, and at 1,800 RPM they are at 34.5 dB. So 200 RPM more gives a 1.5 dB increase. By simple extrapolation, the maximum sound output for the Red Devil is 38-39 dB, which is in line with other silent cooler versions of the RX 480. It is all in the ComputerBase review.

Stop this, Tuxon. We know you are displeased with your RX 480, but you are now being dishonest with readers.
 