He keeps telling others to find sources for themselves because there isn't one, lol. Just let him keep doing his AMD propaganda thing; at least it's fun to read sometimes.
 
GTX1080 in SLI in a Clevo laptop, and not in a standard MXM configuration either.
Seems no one is adhering to standards nowadays.
GDDR5X too, a first in mobile I guess.
Nvidia doesn't even differentiate mobile versions from desktop anymore, sort of: same specs for the 1080, only the clocks vary.
The 10x0 lineup is a mess right now when it comes to specs. The laptop 1060 is only clocked lower than the desktop part, the 1070 has different specs altogether, and the 1080 differs only in clocks (faster on desktop, too). This will confuse the less informed.

I don't think you're too far off koyoot. Just wait a while and behold, everyone's going the nMP way.
 
http://www.guru3d.com/news-story/amd-vega-launch-imminent.html

Thank you. The other information I have provided is also completely correct.

First reason: not everywhere in the world is as rich as the USA, Germany, Great Britain, France, or Australia. There are places where this GPU costs half a monthly salary, like... Poland. The base monthly salary here is 1850 PLN; the GPU currently costs 1250 PLN.

Secondly, AMD GPUs pay for themselves after a few months of mining. And miners are not buying 1-2 GPUs, but 6-8 GPUs in one go.

And to be completely clear: the GTX 1080 is the best-selling high-end GPU in history. In just two months, as many GTX 1080s were sold as Titan Xs and GTX 980 Tis combined over 5 months.

And one last bit: a clock-for-clock comparison between different versions of the same architecture, and the performance gains. It matters if you want to know how much faster a base-model Mac Pro with dual Polaris 10 Pro would be than dual Tahiti XT (D700) at the same clock.

https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=cs&ie=UTF-8&u=https://www.computerbase.de/2016-08/amd-radeon-polaris-architektur-performance/&edit-text=
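Clock-for-clock comparisons like the one linked boil down to normalizing benchmark scores by clock speed before comparing architectures. A minimal sketch of that arithmetic; all numbers below are made-up placeholders, not benchmark data:

```python
def per_clock_gain(score_new, clock_new_mhz, score_old, clock_old_mhz):
    """Relative per-MHz (clock-for-clock) performance gain of the new part."""
    return (score_new / clock_new_mhz) / (score_old / clock_old_mhz) - 1.0

# Hypothetical example: a card 20% faster overall but clocked 10% higher
# shows only ~9% genuine architectural (IPC) improvement.
gain = per_clock_gain(120.0, 1100.0, 100.0, 1000.0)
print(f"clock-for-clock uplift: {gain:.1%}")
```

The point of the normalization is that raw scores conflate clock bumps with real architectural gains; dividing them out isolates the latter.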
 
Thanks for these links. It seems that Polaris 10 would have benefited from GDDR5/X, but maybe it was too expensive even for the 480X and would have eaten AMD's margins.

Vega with HBM2.. now that is going to be interesting.
 
One more thing about 10 nm processes: https://www.semiwiki.com/forum/content/6083-semicon-west-globalfoundries-update.html?

Gary confirmed GlobalFoundries will not be offering a 10nm process.

Exactly the same will happen with TSMC. The 10 nm process is not designed for high-performance parts, only mobile. That is why Nvidia is skipping 10 nm and pushing Volta back to 14/16 nm, and why Intel is adding another 14 nm CPU revision before its 10 nm CPUs.

But I have written most of this already...
 
Thanx koyoot. Here too GPUs are a big chunk of monthly wages. :-(
Like I said before, and ranted a bit about it, GDDR5X should come to Polaris. RX490 maybe?!
I don't think they'll use the 490 name for Vega; that would limit the number of cards to be released, unless it's just one SKU, which I doubt. But the fact is that the RX490/495 was referred to as having a >256-bit memory bus, which is not the case for Polaris 10, as we've had confirmed.
Odd...
Vega should become something else apart from 400 series, like Fury was. Maybe now Rage? :)

It's everywhere now, even on the sites we don't like to link to:
http://wccftech.com/amd-vega-flagship-gpu-launch-teaser/




I guess the successor to the D700 might already be in the works or almost ready...
Hope they don't screw up.
 
RX 490 Q4 2016.
RX Fury Q1 2017.

Both GPUs use the Vega architecture, and the bigger one has 4096 GCN cores.
 
So maybe RX 480X uses GDDR5/X, and moves some watts saved from memory to GPU.
No. There is no GDDR5X memory controller in the Polaris 10 die. It could be added in the future; however, that would mean redesigning the silicon, which would consume at least another 60 million USD and 12 months.
 
So, my guess was true after all: no GDDR5X controller on the Polaris die. Had to be, otherwise AMD would have touted it.
Where did you get the info on the new cards? You probably can't say, but is it reliable?
If that's true, it's odd, releasing 2 similar cards (same family) under 2 different brands/series. 400 and Fury, again?!
Although Fiji was not that far off from Tonga, and those were also split between the 300 series and Fury. Still different, unlike now.
 
It is perfectly understandable. Lisa Su has said that Vega is coming within the next two quarters: RX 490 this year, RX Fury in Q1 2017. The bigger GPU is the one with 4096 GCN cores. Do not expect anything less than an $899-999 price tag for that GPU, and I am pretty sure we will not see the big Vega in the Mac Pro. AMD simply went with a straightforward naming scheme for their GPUs and borrowed some tricks from Nvidia's marketing: a "normal" lineup up to the GTX 1080, with a halo product on top of it, the Titan X.

This is just my prediction for Mac Pro:
Radeon Pro DX300 - Polaris 10 Pro
Radeon Pro DX500 - Polaris 10 XT
Radeon Pro DX700 - Vega "whatever its number is called". Core count higher than RX 480, but lower than 4096 GCN cores.
 
Sounds reasonable.
And it maps right onto the nMP, with the DX700 being a 3072-core GPU with HBM2. A beast of a GPU for those who really need it, and Polaris for the rest who can settle for less. Nice.
But should the DX300 come with slower memory as well?
That would be the 470, 480 and 490 - a nice fit!!
 
And it fits perfectly into the price-tag scheme: tier one, the base model; tier two, +$400; tier three, +$1000 over the base model.
 
Regardless of when the 4096-core Vega part comes out, I wonder if it will be good enough to beat the GTX 1080. The RX 470 is only 15% faster than the R9 380X, and they both have the same number of GCN cores. If Vega is only 15% faster than the Fury X, it will lose handily to Nvidia's GTX 1080, which will have been out for 6+ months.
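A back-of-envelope way to frame that worry: scale a known card's score linearly by core count and clock, then layer an assumed clock-for-clock uplift on top. All inputs below are placeholder assumptions, and real GPUs scale sublinearly with core count, so treat the result as an optimistic upper bound:

```python
def scaled_score(base_score, base_cores, base_mhz,
                 new_cores, new_mhz, ipc_uplift=0.0):
    """Naive linear scaling by cores and clock, plus an assumed IPC gain."""
    return (base_score
            * (new_cores / base_cores)
            * (new_mhz / base_mhz)
            * (1.0 + ipc_uplift))

# Hypothetical: same 4096 cores as a Fury X, clocked 1500 vs 1050 MHz,
# with an assumed 15% clock-for-clock uplift (all placeholder numbers).
estimate = scaled_score(100.0, 4096, 1050, 4096, 1500, ipc_uplift=0.15)
print(f"estimated relative score: {estimate:.0f}")
```

Under those made-up inputs, most of the gain would have to come from clocks rather than the core count, which is exactly the concern raised above.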
 
No idea.
You are not taking into account the higher memory bandwidth and the next-gen graphics IP; Vega is a slightly different architecture. You are also looking at DX11 games, when what matters right now is compute and DX12 games (if gaming is important to you).
 
DP was a hot topic a while back, now it's not needed anymore?
All people do now is AI and deep learning with the new INT8 type?

Right, people like to talk smack about how NVIDIA limits the DP performance in their consumer (i.e. GeForce) products. Can anyone name an application that needs lots of DP performance? Clearly games do not, and video apps like FCP and the Adobe apps don't seem to need it either.
 
It used to be important for some percentage of the market, but it matters less and less over time, because things that can be done in FP16 and FP32 are phasing out FP64. Right now 95% of the market is still FP32. FP16 is still a niche, but it will grow extremely rapidly (it will even matter for DX12 and Vulkan games), while FP64 will be pushed into just a small margin of applications as time goes by.
 
because things that can be done with FP16 and FP32 are phasing out FP64
The same things that require FP64 can always be done on a GPU without FP64 support via a simple compiler transformation, but running a lot of FP64 operations on an FP32 system requires roughly 20x more FP32 instructions plus extra registers, so at best you can hope for about a 1:40 FP64 rate on an FP32-only system.

FP64 may not be necessary for AI or non-HDR video rendering, but lots of scientific/engineering apps require it, as does some high-color HDR rendering.
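For what it's worth, the emulation overhead mentioned above is real: the standard trick is "double-single" (float-float) arithmetic, where each high-precision value is carried as a pair of FP32 numbers. A minimal NumPy sketch of the error-free addition step (function names are mine, and this shows only the add; multiplication needs Dekker splitting on top, which is where the 20-40x instruction blow-up comes from):

```python
import numpy as np

def two_sum(a, b):
    """Knuth's error-free transformation: s + e equals a + b exactly."""
    s = np.float32(a) + np.float32(b)
    v = s - np.float32(a)
    e = (np.float32(a) - (s - v)) + (np.float32(b) - v)
    return s, e

def ds_add(a_hi, a_lo, b_hi, b_lo):
    """Add two double-single (hi, lo) pairs: ~10 FP32 ops per high-precision add."""
    s, e = two_sum(a_hi, b_hi)
    e = e + a_lo + b_lo
    return two_sum(s, e)

# 1.0 + 1e-8 is lost entirely in plain float32 (eps ~ 1.2e-7),
# but survives as the lo part of the pair:
hi, lo = ds_add(np.float32(1.0), np.float32(0.0),
                np.float32(1e-8), np.float32(0.0))
```

Comparing `hi + lo` in double precision against a plain float32 sum shows the recovered bits; hardware FP64 does all of this in one instruction, which is the whole argument for native support in sci/eng workloads.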
 