
ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
I wouldn't count on them putting a full-fat Vega with 4096 cores in the nMP, but who knows. It's either fewer cores at a higher frequency, or a higher count at lower clocks. Either way, they'd probably still need to roughly halve the power consumption of the full Vega GPU, I guess.
 

pat500000

Suspended
Jun 3, 2015
8,523
7,515
There's no chance they would put Vega in the nMP?
And Mago was right. Well, sort of. :D

http://videocardz.com/59808/amd-vega-gpu-allegedly-pushed-forward-to-october
http://www.3dcenter.org/news/amd-zieht-den-vega-launch-angeblich-auf-oktober-2016-vor
Vega in October? ;)

[Image: eSilicon HBM presentation, slide 27]

They didn't push it forward by much, did they? ;)
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Pat, nobody apart from Apple's engineering team knows this. Vega parts are already in the wild. The TDP, IF the rumors are true, is on the level of the Nano, and the sample boards are also Nano-sized.

Remember when I said that AMD's marketing team sucks? :D Well, the Vega rumor will pull all the attention away from their exact counter to GP104 and set expectations for those dies very, very low :D. So rather than attracting attention to their own products, AMD ultimately handed Nvidia the spotlight, especially among the people who would be interested in that level of performance. And that's before we even know what level of performance the Polaris GPUs will ultimately have! :D

AMD's top management is full of engineers who have no idea about marketing. Nvidia's top management is full of marketing and PR people rather than engineers. Nvidia knows how to make money; AMD knows how to push the industry forward. The two management approaches show in the success of each brand, and mindshare is the key factor driving it.
 
  • Like
Reactions: pat500000

tuxon86

macrumors 65816
May 22, 2012
1,321
477
Pat, nobody apart from Apple's engineering team knows this. Vega parts are already in the wild. The TDP, IF the rumors are true, is on the level of the Nano, and the sample boards are also Nano-sized.

Remember when I said that AMD's marketing team sucks? :D Well, the Vega rumor will pull all the attention away from their exact counter to GP104 and set expectations for those dies very, very low :D. So rather than attracting attention to their own products, AMD ultimately handed Nvidia the spotlight, especially among the people who would be interested in that level of performance. And that's before we even know what level of performance the Polaris GPUs will ultimately have! :D

AMD's top management is full of engineers who have no idea about marketing. Nvidia's top management is full of marketing and PR people rather than engineers. Nvidia knows how to make money; AMD knows how to push the industry forward. The two management approaches show in the success of each brand, and mindshare is the key factor driving it.

Here you go again....
 

pat500000

Suspended
Jun 3, 2015
8,523
7,515
Pat, nobody apart from Apple's engineering team knows this. Vega parts are already in the wild. The TDP, IF the rumors are true, is on the level of the Nano, and the sample boards are also Nano-sized.

Remember when I said that AMD's marketing team sucks? :D Well, the Vega rumor will pull all the attention away from their exact counter to GP104 and set expectations for those dies very, very low :D. So rather than attracting attention to their own products, AMD ultimately handed Nvidia the spotlight, especially among the people who would be interested in that level of performance. And that's before we even know what level of performance the Polaris GPUs will ultimately have! :D

AMD's top management is full of engineers who have no idea about marketing. Nvidia's top management is full of marketing and PR people rather than engineers. Nvidia knows how to make money; AMD knows how to push the industry forward. The two management approaches show in the success of each brand, and mindshare is the key factor driving it.
... Lol, well, there goes that Mac Pro.
 
Last edited:

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Well, both of you have just proven exactly what I wrote in my post. The mindshare of both brands, and the perception of them, drove your responses.

I suggest waiting for the presentation of the P10 GPUs before drawing any conclusions.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
P10 will bring double the performance to the MP, and nobody's satisfied with that?

I can understand those who cannot live without CUDA. But then again, the nMP is not their machine, and that's it. It hasn't been since late 2013.
 
Last edited:
  • Like
Reactions: hollyhillbilly

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
I sure hope AMD will do well this time, not only to come back up (good for everyone) but also so that some people here will have to swallow their own crap :)
Z, double performance would be awesome. I sure hope they can deliver; I'm confident they will.
It's gonna be a great machine, and the haters here will take their rants somewhere else.
And just so there's no mistake: Pascal would be great in the nMP, but we all know Nvidia won't happen, so... Polaris it is.
 

MacVidCards

Suspended
Nov 17, 2008
6,096
1,056
Hollywood, CA
P10 will bring double the performance to the MP, and nobody's satisfied with that?

"I'd gladly pay you Tuesday for a hamburger today."

After the last few fiascos I think everyone would rather thank AMD for "double performance to nMP" sometime AFTER they actually ship it.

Something about the CEO getting up on stage and telling a pack of lies soured the faith. (Google "overclocker's dream".)
 
  • Like
Reactions: tuxon86

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
"I'd gladly pay you Tuesday for a hamburger today."

After the last few fiascos I think everyone would rather thank AMD for "double performance to nMP" sometime AFTER they actually ship it.

Something about the CEO getting up on stage and telling a pack of lies soured the faith. (Google "overclocker's dream".)
So far the only one lying is you. It was not the CEO who claimed the Fury X was an overclocker's dream. But you know, you're an expert in this area. You should know this.
http://www.babeltechreviews.com/overclocking-fury-x-part-2/
Overall, overclocking Fury X has been disappointing to us, and it is not quite the “overclocker’s dream” that AMD’s Richard Huddy suggested that we would see in his preview during E3.

P.S. Wooden screws on Fermi, anyone? Maxwell GPUs presented as Pascal? The GTX 970 3.5 GB fiasco? JHH has much more behind his ears than Lisa Su does.
 
Last edited:

Stacc

macrumors 6502a
Jun 22, 2005
888
353
There's no chance they would put Vega in the nMP?

It's certainly possible, but it is uncertain when it will come out. Given that the current Mac Pro is still on Tahiti, just about anything AMD releases will be a significant upgrade.

I wouldn't count on them putting a full-fat Vega with 4096 cores in the nMP, but who knows. It's either fewer cores at a higher frequency, or a higher count at lower clocks. Either way, they'd probably still need to roughly halve the power consumption of the full Vega GPU, I guess.

Why wouldn't they? Say they just do a direct shrink of the Fury X; that would put the die size at roughly 300 mm². Tahiti, currently in the Mac Pro, is about 350 mm². Since power usage roughly scales with die size, it's perfectly reasonable that we could see something bigger than Polaris 10 (if it's ~230 mm²) in a Mac Pro.
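Just to show where that estimate comes from, here's a minimal back-of-the-envelope sketch; the ~596 mm² Fiji figure is the published die size, but the 0.5 area-scaling factor for a 28 nm to 14 nm FinFET shrink is my own assumption, not anything AMD has stated.

    # Rough die-size estimate for a hypothetical Fiji (Fury X) shrink to 14 nm.
    # ASSUMPTION: the node shrink cuts die area roughly in half; real scaling
    # varies per block (SRAM, analog and I/O shrink far less than logic does).
    fiji_28nm_mm2 = 596.0          # Fiji die size on 28 nm
    assumed_area_factor = 0.5      # assumed 28 nm -> 14 nm area scaling
    shrunk_die_mm2 = fiji_28nm_mm2 * assumed_area_factor
    tahiti_mm2 = 352.0             # Tahiti (the D700 GPU in the current Mac Pro)
    polaris10_mm2 = 230.0          # rumored Polaris 10 size from this thread
    print(f"Shrunk Fiji-class die: ~{shrunk_die_mm2:.0f} mm^2 "
          f"(vs Polaris 10 ~{polaris10_mm2:.0f}, Tahiti ~{tahiti_mm2:.0f})")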

And just so there's no mistake: Pascal would be great in the nMP, but we all know Nvidia won't happen, so... Polaris it is.

I expect that the GTX 1080 is going to crush everything in Windows gaming benchmarks. But why would we, or Apple, care about that? Historically, AMD has had much better OpenGL and OpenCL performance in OS X. Look at something like Final Cut Pro: AMD still has a big performance advantage there, and this will likely continue into the next generation of GPUs.

Take a look at some of the benchmarks over at Barefeats. The D700 is beating the Nvidia GTX 980 Ti in some games. Not bad for a five-year-old GPU. So while Nvidia may look good in Windows gaming benchmarks, that is not the criterion Apple is going to use when selecting a GPU.

The last bit that may be relevant is that the GTX 1080 is a pure gaming chip. There is a reasonable chance that Polaris has fairly good double-precision compute performance. Even if Polaris loses to the 1080 in gaming benchmarks, it may win the compute benchmarks. Of course, we will know a lot more in a month.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Pascal is still nowhere. It's promised to a few rich kids as a Founders Edition. And it's Nvidia's first chip that can do something other than CUDA. DX12 and coin mining showed a long time ago that the emperor had no clothes. MVC was cursing the name of async compute just a couple of days before Nvidia's PR stunt, and now it's Pascal's main new feature that will work miracles. What a difference a night makes.

Even the almighty Titan X couldn't match the Fury Nano in compute... and the latter was considered a gaming card, while the Titan X was not. Nvidia is popular outside gaming just because they have CUDA. Software. The hardware has been aimed mainly at FPS shooters and others in need of a virtual manhood extension.

AMD got its reputation as a world heater because it tried to chase gamers' needs, where Nvidia's GPUs dominate. They had to push clocks past the point of good perf/watt and just let it burn. But make them do real work, GPGPU, and strip away CUDA, and they're tied in perf/watt... or, as with the Fury Nano, the Titan X is left at the traffic lights heating the neighborhood. I think this was just another example of Nvidia having a better marketing team. If AMD had released the Fury Nano first, and hadn't marketed it as an HTPC miracle at a bit too high a price, they'd have saved face, but I suppose it was a marketing decision to start with the Fury X. So, their loss.

But as mentioned before, it's the software where Apple and AMD have both stumbled... there just isn't a lot of support for OpenCL or the dual-GPU combination. I'm sure Apple knows this, but when will they fix it? The next chance is the upcoming OS.

And final words: I don't consider AMD the best thing in the world. I just feel sympathy for the small players who try to compete against the bosses and their paid bullies.
 
Last edited:
  • Like
Reactions: ShadovvMoon

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
AMD got its reputation as a world heater because it tried to chase gamers' needs, where Nvidia dominates. But make them do real work, GPGPU, and strip away CUDA, and they're tied... or, as with the Fury Nano, the Titan X is left at the traffic lights heating the neighborhood.
Nope, it's because their reference blower sucked. Not air; it just sucked. It happened with the R9 290 and 290X.

The R9 390 and 390X did not have this problem. They used even less power (4W, but it was a difference ;) ) than their predecessors, despite having more RAM, higher core clocks, and higher memory clocks.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
OK, I was referring to perf/watt... but yes.
Performance per watt? It was on the same level as Nvidia's. The R9 390X and GTX 980 Ti have a similar thermal envelope and similar compute performance. It was software that held AMD GPUs back in games, and from that came Nvidia's mindshare advantage on performance per watt.

Look what happened lately when the software caught up. The R9 380X is on the level of the GTX 980, and the R9 390X is on the level of the GTX 980 Ti. And in some cases performance per watt was much better than Nvidia's offerings (Fiji, overall).

People are staggered when the R9 390X is much faster in OpenCL than the GTX 980. But it has to be! It has almost 6 TFLOPs of compute power, compared to 4.6. Again, mindshare played the key role in driving the perception of AMD GPUs.
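For what it's worth, those TFLOP figures fall straight out of shader count × 2 FLOPs per clock (FMA) × clock speed. Here's a minimal sketch with the reference specs; which clock to plug in (base vs. boost) is my assumption about what those numbers are counting:

    # Peak FP32 throughput = shaders * 2 ops per clock (fused multiply-add) * clock.
    def peak_tflops(shaders: int, clock_mhz: float) -> float:
        return shaders * 2 * clock_mhz * 1e6 / 1e12

    print(f"R9 390X: {peak_tflops(2816, 1050):.1f} TFLOPs")   # 2816 SPs @ 1050 MHz -> ~5.9
    print(f"GTX 980: {peak_tflops(2048, 1126):.1f} TFLOPs")   # 2048 cores @ 1126 MHz (base) -> ~4.6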
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Performance per watt? It was on the same level as Nvidia's. The R9 390X and GTX 980 Ti have a similar thermal envelope and similar compute performance. It was software that held AMD GPUs back in games, and from that came Nvidia's mindshare advantage on performance per watt.

Look what happened lately when the software caught up. The R9 380X is on the level of the GTX 980, and the R9 390X is on the level of the GTX 980 Ti. And in some cases performance per watt was much better than Nvidia's offerings (Fiji, overall).

People are staggered when the R9 390X is much faster in OpenCL than the GTX 980. But it has to be! It has almost 6 TFLOPs of compute power, compared to 4.6. Again, mindshare played the key role in driving the perception of AMD GPUs.
I think I need to take English lessons, because that's basically what I said, and you just said it again.
  • GPGPU-wise, Nvidia was playing on the kindergarten team
  • Software-side, AMD's motorbike didn't have a chain
 

MacVidCards

Suspended
Nov 17, 2008
6,096
1,056
Hollywood, CA
Performance per watt? It was on the same level as Nvidia's. The R9 390X and GTX 980 Ti have a similar thermal envelope and similar compute performance. It was software that held AMD GPUs back in games, and from that came Nvidia's mindshare advantage on performance per watt.

OK, you just moved from AMD PR cheerleader into outright lying.

Have a look:

https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/31.html

And who pulls up the rear? NEARLY EVERY SINGLE LOSING CARD IS AN AMD.

Get back to the PR script; they know where their strengths are, and perf per watt isn't one of them.
 
  • Like
Reactions: tuxon86

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
OK, you just moved from AMD PR cheerleader into outright lying.
Have a look:

https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/31.html

And who pulls up the rear NEARLY EVERY SINGLE LOSING CARD IS AN AMD.

Get back to the PR script; they know where their strengths are, and perf per watt isn't one of them.
LOLOLOLOLOLOL You have just proven me right and supported the argument I made in my post. Reading comprehension.
IF an AMD GPU is bottlenecked by software, it will show lackluster performance. If software is not bottlenecking an AMD GPU, it will be on the same level as Nvidia's. The GTX 980 Ti has 6 TFLOPs of compute power within a 250W TDP. The R9 390X has around 6 TFLOPs of compute and a 250W TDP. The only one doing PR here is you. You don't even understand what is written in the posts.
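To put that perf/watt claim in numbers, here's a minimal sketch of the arithmetic using the figures as stated above; the 250W TDPs and ~6 TFLOPs values are the claims from this post, not spec-sheet numbers I've verified:

    # Peak GFLOPs per watt, taking the TFLOP and TDP figures claimed above (assumptions).
    claimed = {
        "GTX 980 Ti": (6.0, 250.0),   # (peak TFLOPs FP32, TDP in watts), as claimed
        "R9 390X":    (5.9, 250.0),
    }
    for name, (tflops, tdp_w) in claimed.items():
        print(f"{name}: {tflops * 1000 / tdp_w:.1f} peak GFLOPs per watt")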

What is funnier in this context is that you used an overvolted and overclocked version of this GPU as your example. Look for a stock GPU. Like this one: http://tpucdn.com/reviews/Powercolor/R9_390_PCS_Plus/images/power_average.gif
How come a GPU with a 10 MHz core OC uses 4W less power than the previous version of that GPU, while having twice the RAM and much higher core and memory clocks?
Edit: http://tpucdn.com/reviews/Powercolor/R9_390_PCS_Plus/images/power_peak.gif Here it is even 10W less.
I know you want to do PR here, but please be factual. You use DX11 gaming benchmarks as the measure of AMD's performance per watt, without even understanding the performance of the GPUs. Either ignorant, or ...
 
Last edited: