[Image: AMD Vega die shot]

http://creators.radeon.com/radeon-pro-vega/
The die size appears to be around 490 mm², and AMD appears to hint at 8 Shader Engines. If anyone remembers my post from a long time ago in this very thread, you will know that this is exactly what I told you.

With exact link for those who require this: https://forums.macrumors.com/thread...s-announcements.1975249/page-40#post-23622836

Post from September last year.
 
I have to answer that post myself. I got more specific information.

There are 4 Shader Engines, partitioned into two Compute Clusters with 8 CUs each. They should behave differently than before. This is supposed to increase efficiency, because unused clusters can be powered down. It also allows for better load balancing, as AMD has touted in its technical announcements about Vega. Secondly, they will behave like 8 Shader Engines. Thirdly, it has not been specified to me, but it was hinted that those 8 CUs potentially have the same amount of resources available as 16 CUs in Fiji, so throughput could have been increased this way.
 
More and more stuff is getting out.

https://seekingalpha.com/article/40...-stifel-nicolaus-2017-technology-internet-and

"In the in the Ryzen, Ryzen processor is just as an example, we've launched seven versions of the Ryzen processor to-date and if you look at those seven versions relative to their Intel competitor that's most nearly priced competitor you'll see we're bringing anywhere from 30%, sometimes over 70% performance advantage. And when we launch our Vega graphics processor that's coming very shortly, it will come later this month, you'll see similar level of competitive performance in high-end part of the desktop or high-end part of the graphics market."

Very bold statement, Mr. Anderson.

Just a side note: Ethereum was at $190 a few days ago. Today it's at $260, and Bitcoin was losing value, but it is now back up to $2,900. Mining craze - here we go...

It's just like the gold rush a couple of centuries ago...
 
There's a rumor of AMD and NVIDIA coming out with cheaper cards without video ports.

I hope they will have heavy duty fans and work well in multi GPU for when the miners dump them.
 
As long as AMD cards keep getting snapped up and price-inflated by bitcoin weirdos, and as long as I need to disable SIP and do kext editing to make them work, I'll be going with the Nvidia option.
 
There's a rumor of AMD and NVIDIA coming out with cheaper cards without video ports.

I hope they will have heavy duty fans and work well in multi GPU for when the miners dump them.
It's not a rumor. AMD and Nvidia are doing this to avoid price inflation.
 
Back to this topic.

There are 4 Shader Engines partitioned into two Compute Clusters, each with the same amount of resources as 16 CUs in Fiji. Each Compute Cluster has its own, independent Geometry Engine. Everything will behave like 8 Shader Engines. The performance of the Geometry Engine is tied directly to the GPU clock.

AMD claimed that Vega has 2.7 times the geometry performance of Fiji in a 4-vs-4 Shader Engine comparison. This explains how they achieved it.
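As a back-of-the-envelope check: doubling the effective Shader Engine count and scaling with clock speed lands very close to AMD's 2.7x figure. Note the ~1400 MHz Vega clock below is an assumption (final clocks weren't confirmed at the time); the Fiji clock is the Fury X's 1050 MHz.

```python
# Rough sanity check of the claimed 2.7x geometry throughput.
# The Vega clock is an assumed value, not a confirmed spec.
fiji_engines = 4
vega_effective_engines = 8   # 4 engines behaving like 8, per the post

fiji_clock_mhz = 1050        # Fury X reference clock
vega_clock_mhz = 1400        # assumption

speedup = (vega_effective_engines / fiji_engines) * (vega_clock_mhz / fiji_clock_mhz)
print(f"Estimated geometry speedup: {speedup:.2f}x")  # ~2.67x, close to AMD's 2.7x
```

If the effective engine count really doubles, the clock only needs to rise about 33% over Fiji to hit the quoted number.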
 
As in "secured a contract for future deliveries" - the iMac Xeon isn't available for seven months.
Because it is waiting for Xeon CPUs to be released by Intel...

Jeanlain was talking about the Radeon Pro 580 and the rest of the GPUs in the iMacs that are available today.
 
Whatever you do, don't join this sudden rush to buy up GPUs for mining Ether. The time for GPU mining was the last 2 years, when you could mine an Ether every day. Now a powerful GPU can't mine an Ether in a month.

The difficulty of mining is now increasing at such a rate that profits shrink every month. You can only maintain high profits by spending more and more on mining rigs. It becomes like a pyramid scheme, where you spend more and more to make less and less.
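To put rough numbers on that claim, expected mining output is just your share of the network hashrate times the daily block reward. All inputs below are ballpark mid-2017 assumptions, not measurements:

```python
# Hypothetical back-of-the-envelope Ethereum mining estimate.
# All inputs are rough mid-2017 ballpark assumptions.
gpu_hashrate = 25e6          # 25 MH/s, a typical high-end GPU (assumed)
network_hashrate = 50e12     # ~50 TH/s total network hashrate (assumed)
block_reward_eth = 5         # ETH per block at the time
block_time_s = 15            # average block time (assumed)

blocks_per_day = 86400 / block_time_s
eth_per_day = (gpu_hashrate / network_hashrate) * blocks_per_day * block_reward_eth
days_per_eth = 1 / eth_per_day

print(f"{eth_per_day:.4f} ETH/day -> about {days_per_eth:.0f} days per ETH")
```

Under these assumptions a single GPU earns well under one Ether per month, and every new rig joining the network pushes that number further out.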

Mining difficulty is one thing. Then there is the issue that the value of any coin can collapse at any moment, because coins are not backed by a real product. They can be stolen, hacked, lost to bugs, or become impossible to cash out at a thieving, unregulated exchange (this happens every week).

The best thing to do is use your SLI/Crossfire rigs for mining when you aren't doing something productive. Then there's no risk.

As for the other coins, 98% of them are scams, and their founders even call each other scammers on Twitter, whether as a joke, seriously, or to manipulate the market. Look at the fights between the heads of Monero and others.
 