
goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Of course you could argue that not all tasks can utilize the dual GPUs, but anything properly coded on OS X to use OpenCL or Metal should have no problem with this. The nice thing about compute tasks for the GPU is that they are already embarrassingly parallel, so throwing more GPUs at the problem is a perfectly fine solution.

For compute that's perfectly fine. But if you have a graphics API heavy workflow, dual GPUs are a problem. Even with Crossfire they don't scale well.

A lot of people have problems considering the dual GPUs as a single larger GPU when there is almost no OS X software that taps multiple GPUs besides stuff like Final Cut.
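For illustration, here is a minimal sketch (not from any shipping app) of what "embarrassingly parallel" looks like in practice on OS X: the app enumerates every Metal-capable GPU itself and hands each one a slice of the work, rather than relying on the driver to pair the cards Crossfire-style. The kernel name "square", the grid sizes, and the commented-out buffer binding are made-up placeholders, and the calls use current Swift/Metal naming, so treat it as a sketch under those assumptions.

Code:
import Metal

// Sketch: split an embarrassingly parallel compute job across every GPU in the box.
// On a nMP, MTLCopyAllDevices() returns both FirePros; each gets its own queue and
// its own slice of the input, so the scaling is done by the app, not by Crossfire.
let devices = MTLCopyAllDevices()

for device in devices {
    guard let queue = device.makeCommandQueue(),
          let library = device.makeDefaultLibrary(),                 // assumes a compiled .metallib in the bundle
          let kernel = library.makeFunction(name: "square"),         // placeholder kernel name
          let pipeline = try? device.makeComputePipelineState(function: kernel),
          let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { continue }

    encoder.setComputePipelineState(pipeline)
    // encoder.setBuffer(sliceForThisGPU, offset: 0, index: 0)       // bind this GPU's share of the data
    let threadsPerGroup = MTLSize(width: 64, height: 1, depth: 1)
    let threadgroups = MTLSize(width: 1024, height: 1, depth: 1)     // placeholder grid size
    encoder.dispatchThreadgroups(threadgroups, threadsPerThreadgroup: threadsPerGroup)
    encoder.endEncoding()
    commandBuffer.commit()                                           // both GPUs crunch their slices concurrently
}

Compute work slices up this cleanly; graphics frames generally don't, which is the point being made above about Crossfire.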
If a 7,1 is shown, there will be much wailing here.
If it's not shown, there will be much wailing here.

Either way, prepare for much wailing here.

Oh, if I can't buy the new Mac Pro in an Outrigger case I'm going to throw a fit. Just another example of Apple not listening to their customers and abandoning long-time markets!
 
  • Like
Reactions: pat500000

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
For compute that's perfectly fine. But if you have a graphics API heavy workflow, dual GPUs are a problem. Even with Crossfire they don't scale well.

A lot of people have problems considering the dual GPUs as a single larger GPU when there is almost no OS X software that taps multiple GPUs besides stuff like Final Cut.
Start asking developers to utilize it properly. It is like blaming the hardware for what the software cannot do. Shouldn't it be the other way around?
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
[attached image: hqhmU.png (Broadwell-E pricing/spec chart)]


By the looks of things, the 1680v4 may have a price tag of around $1000. Or rather, Intel will shift the tiers in the naming scheme, so the 8-core may now be the 1660v4 and the 10-core the 1680v4, compared to the 1660v3 (6-core) and 1680v3 (8-core).
 
Last edited:

Stacc

macrumors 6502a
Jun 22, 2005
888
353
Start asking developers to utilize it properly. It is like blaming the hardware for what the software cannot do. Shouldn't it be the other way around?

Apple could do a better job of keeping OpenGL up to date and of supporting cross-platform APIs like Vulkan. While I don't see Apple coming out with any VR hardware any time soon, perhaps the mere existence of it will spur some much-needed attention on graphics in OS X. Metal is nice, but so few Macs are capable of running high-end games and VR experiences that it's tough for many developers to focus on Mac-specific APIs.

[attached image: hqhmU.png (Broadwell-E pricing/spec chart)]


By the looks of things, the 1680v4 may have a price tag of around $1000. Or rather, Intel will shift the tiers in the naming scheme, so the 8-core may now be the 1660v4 and the 10-core the 1680v4, compared to the 1660v3 (6-core) and 1680v3 (8-core).

Looks good. Similar prices to Haswell-E, with a small clock bump. I'm looking forward to seeing the benchmarks on these.
 
  • Like
Reactions: Mago

antonis

macrumors 68020
Jun 10, 2011
2,085
1,009
As an ex-owner of a nMP, I should say that Crossfire is not even available on OS X (nothing the devs can do about that; the OS X drivers completely lack the feature), while on the Windows side it is, at best, a hit-and-miss feature and the results vary case by case. You may get any of the following:
- Double the performance compared to single-GPU usage
- Exactly the same performance as single-GPU usage
- Lower performance compared to single-GPU usage
- Crashes

Having said that, though, Crossfire is hit and miss on the PC side as well. I have the impression that SLI is generally in better shape, but I might be wrong on that.
 
Last edited:

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Mago, I hope you were not referring to me as an Apple hardware hater. Maybe the article's author?
If you read my earlier posts you'll see I'm no hater, quite the contrary. I started this thread with my hopes up.
There are people here who are indeed haters, but not me.
I'm hoping to see a new nMP with Polaris at WWDC; it's been a long wait, and I'll get one as soon as it comes out.
Cheers...
 
  • Like
Reactions: FaithfulCalebTN

pat500000

Suspended
Jun 3, 2015
8,523
7,515
Mago, I hope you were not referring to me as an Apple hardware hater. Maybe the article's author?
If you read my earlier posts you'll see I'm no hater, quite the contrary. I started this thread with my hopes up.
There are people here who are indeed haters, but not me.
I'm hoping to see a new nMP with Polaris at WWDC; it's been a long wait, and I'll get one as soon as it comes out.
Cheers...
I don't think so...more like the article.
 

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
Let me try to understand something that's not clear to me. Polaris will be a "stripped-down" version of Vega, both at 14nm.
I have read many people hoping that it will be used in the MBP as well as in the upcoming nMP...
Isn't the point of a Pro machine to have - maybe not the fastest on Earth, but still - a powerful (Pro) video card?

I do get that the present MP has a thermal limit, but it is kind of sad (if I got it right) that we are hoping to get a "mobile" video card (as in the iMac) rather than a Pro one.
Shouldn't we hope for better cards, aside from the fact that the new "mobile" cards are still more powerful than the older Pro cards installed in the present MP?

Thanks to anyone who could explain it to me! :rolleyes:
 
  • Like
Reactions: MacVidCards

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
There are certain sweet spots for every CPU/GPU architecture. Polaris 10 seems to be designed to work best at 110-130W TDP. I am sure it is possible to overclock it and run it at 180W or more, but the performance per watt gets worse.

This must have been one of the reasons why the nMP has two low-power GPUs instead of one high-power GPU. It gets better GPGPU results that way while still keeping the thermal requirements low.

It could be that the GTX 1080's best perf/watt ratio is somewhere around 150W, but Nvidia wanted it to be the fastest card available, so they gave it more boost at the expense of power efficiency. Still, what it can do at 180W is excellent.

Polaris 11 is a 50W, mobile-focused GPU. I am sure it can be run anywhere between 35W and 80W TDP, but at 50W it has the best perf/watt ratio.

Most Intel processors seem to be efficient up to 20W per core; after that, TDP goes up more quickly than speed increases and efficiency gets worse.

Apple's vision was to make a quiet and relatively small desktop computer. In 2013 it was the fastest machine you could make in the 450W category in terms of GPGPU computing. In 2016 things might have changed somewhat, and it could be that they'd get a better result if they updated it to a 500-550W machine, but so far we really don't know the true colors of the latest GPUs... within a month we should know.
 
Last edited:
  • Like
Reactions: askunk

Mago

macrumors 68030
Aug 16, 2011
2,789
912
Beyond the Thunderdome
Mago, I hope you were not referring to me as an Apple hardware hater. Maybe the article's author?
If you read my earlier posts you'll see I'm no hater, quite the contrary. I started this thread with my hopes up.
There are people here who are indeed haters, but not me.
I'm hoping to see a new nMP with Polaris at WWDC; it's been a long wait, and I'll get one as soon as it comes out.
Cheers...

The Article's author, take it easy.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
AMD's Fiji is excellent educational material on how performance-per-watt calculations go wrong. It was evidently designed to be a 150-175W TDP GPU, but then it became too expensive to make and yields were poor, so they (the marketing dept) had to make it as fast as possible in order to justify its high price... But going from 175W to 275W gave it just 10% more compute power: a >50% increase in power draw for a 10% speed gain. If AMD had gone with 150-175W from the beginning, their reputation would have been very different. But that's how PR/marketing departments work... and bad luck.
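As a rough back-of-envelope using only the figures quoted above (175W, 275W and +10%, which are the poster's estimates rather than official specs): 275W / 175W ≈ 1.57, so board power rises by about 57% while throughput rises by about 10%, and performance per watt falls to roughly 1.10 / 1.57 ≈ 0.70 of what it would have been at 175W, i.e. around a 30% efficiency loss for that last 10% of speed.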
 
Last edited:

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Yields were not bad. What is difficult is manufacturing an almost 600mm² die onto an interposer, and the thermal problems that brings. It is exactly the same problem Nvidia has with GP100. It all comes down to how the materials behave thermally, and especially the different temperatures of different parts of the same unit: one part can be warmer than another, and that causes huge problems for the interposer, for the die itself, and for the PCB it is layered on.
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,038
760
West coast, Finland
Yields were not bad.

Whatever the reason, there was this problem: http://www.hardwareluxx.com/index.p...pments-substantially-reduced-to-partners.html

At AMD, the engineers' point of view was that they had an awesome product (Fiji at 150-175W TDP) with some overclocking potential. The marketing team was thinking differently. They needed a better price, so they overclocked the product themselves and sold it as the Fury and Fury X.

I think that Nvidia's "Founders Edition" is their way of answering the short-supply problem.
 
Last edited:

Miguel Cunha

macrumors 6502
Sep 14, 2012
389
102
Braga, Portugal
[attached image: hqhmU.png (Broadwell-E pricing/spec chart)]


By the looks of things, the 1680v4 may have a price tag of around $1000. Or rather, Intel will shift the tiers in the naming scheme, so the 8-core may now be the 1660v4 and the 10-core the 1680v4, compared to the 1660v3 (6-core) and 1680v3 (8-core).
Are these Xeon CPUs?
As far as I know, Core i7 series CPUs are used in iMacs.
Of course, these ones have loads of cache memory, and clock speeds and core counts on par with current Mac Pros.
Did Intel change the naming scheme?
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
No. Those are Broadwell-E chips from the high-end desktop (HEDT) line-up. They will not appear in the Mac Pro, or even in the iMac. I posted them to give an indication of Broadwell-EP Xeon E5 pricing and core configs for the 16XXv4 CPUs. The two lines are basically the same, except that the Xeon CPUs cannot be overclocked. Core configs, clocks, and TDPs are the same, and even the pricing matches specific versions of the CPUs.
 

ManuelGomes

macrumors 68000
Original poster
Dec 4, 2014
1,617
354
Aveiro, Portugal
Hi Miguel. Nice to see a countryman. Those are not Xeons, but -E HEDT chips. Xeons are -EP, although they share some features.
Sorry koyoot, you had already replied; I failed to see it.

Mago, no sweat. I was just making sure. :)

OS X 10.11.5 final build is available.

GTX 1080 features leaked:
http://videocardz.com/59962/nvidia-geforce-gtx-1080-final-specifications-and-launch-presentation

And some OpenCL benches:
http://videocardz.com/59951/nvidia-geforce-gtx-1080-opencl-performance-leaked
 

Larry-K

macrumors 68000
Jun 28, 2011
1,908
2,363
Apple's vision was to make a quiet and relatively small desktop computer. In 2013 it was the fastest machine you could make in the 450W category in terms of GPGPU computing. In 2016 things might have changed somewhat, and it could be that they'd get a better result if they updated it to a 500-550W machine, but so far we really don't know the true colors of the latest GPUs... within a month we should know.
And we all know that every discerning workstation user always insists on using only 450 watts and having dual, crippled GPUs in a 3 lb coffee-can-sized form factor (like Folger's, not the good stuff) that only work in a handful of applications; and, of course, these physical limitations have to cost twice as much as the previous model, which wasn't hamstrung by this vision.

Where did this category come from? I need glasses, but my uncorrected vision is nowhere near that bad.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
And we all know that every discerning workstation user always insists on using only 450 watts and having dual, crippled GPUs in a 3 lb coffee-can-sized form factor (like Folger's, not the good stuff) that only work in a handful of applications; and, of course, these physical limitations have to cost twice as much as the previous model, which wasn't hamstrung by this vision.

Where did this category come from? I need glasses, but my uncorrected vision is nowhere near that bad.
https://forums.macrumors.com/threads/new-mac-pro-at-wwdc-june-2016.1971061/page-6#post-22903676

If you look into the past - then yes, you will not see it.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
You mean back when Apple made computers that could actually do the job?

Yeah, that was the past.
Have you read the post at that link? Or is complaining about how awful Apple's decisions are all you can do?
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
Have you read the post at that link? Or is complaining about how awful Apple's decisions are all you can do?
Maybe he thought that the post was an attempt at rationalization - especially since the biggest GPU news of the last five weeks concerns a 300 watt GPU, and a small server containing eight of them mounted internally.... :rolleyes:
 
  • Like
Reactions: tuxon86

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Maybe he thought that the post was an attempt at rationalization - especially since the biggest GPU news of the last five weeks concerns a 300 watt GPU, and a small server containing eight of them mounted internally.... :rolleyes:
It is the last node where a 600mm² die size is affordable, which I have already mentioned in previous posts...
Yes, I read the post.

So you're saying that Apple's bad decisions from 2013 may be a good idea in 2018?

That would be great if I were taking a 5-year vacation.
It is up to anyone to think that way. I tried to educate myself about the future of computing and where the industry is going. And for where the world is going, the Mac Pro in its trash-can form factor is not a bad idea. For where it was yesterday, it was a bad idea. Right now we are in a middle ground. I am not saying that it is the best design for where things are going; somebody may come up with an even better idea, but the Mac Pro was first.

3 key things to consider here: efficiency, scalability of performance, and the HSA foundation.
 
Last edited: