Just put two GTX 1070 in the new nMP and everybody will be happy.
Now that we've nearly hit a wall in Moore's law, partial or incremental updates are unlikely to result in significant gains.
Moore's law is slowing, but GPUs are further from that wall than CPUs are, and even CPUs are still a ways from hitting it.
Guys, it's like turning the whole table around to reach a dish on the other side, rather than just walking around the table to get it.
What percentage of pro-oriented applications can really take advantage of dual GPUs (considering that OS X doesn't support CrossFire)? And is it entirely honest to add up the computing power of two weaker GPUs as an argument against a single more powerful one?
As you mentioned earlier about the last-minute leak, it sounds like the D500 model would be a better buy, wouldn't it?

I still think Apple will not tie the GPU choice strictly to feature size, but rather to actual performance/TDP.
Polaris is just an OK replacement for the D300/D500; a D700 successor needs a card targeted more at compute than at pixel rendering. My wish is for it to be an AMD Vega-based GPU (not available until Q4 at the earliest, Q1'17 more realistically). So what will Apple do? Delay the nMP forever, or release it now with Polaris and maybe a decent compute GPU like the W8100? I'd opt for the W8100 despite it being a 28nm GPU, or no G700 option at all for now: leave that slot empty until Q1'17 and fill it with AMD Vega.
Leakers... I don't know why they insist the MP will include dual Fury Nano-based GPUs. The Fury Nano is offered with up to 4GB of RAM, but AMD could offer it with 8GB; the parts are available. Personally I don't like the dual Fury Nano setup; I'd go for the D500 if it's based on Polaris.
Maybe Apple considers VR and AI more important than compute, so they don't care about putting a good workstation GPU in the Mac Pro; that's what I fear from the leakers' insistence on dual Fury as the G700 GPU. And that actually isn't the end of it: as I wrote previously, it's more likely I'll go for dual Polaris, and it's very likely I'll later get an nVidia eGPU.
Tomorrow we decide...
That tomorrow is killing me.
Oh, I ain't watching either, due to time... I will eventually. I want to watch it so that hopefully someone at WWDC yells "Z1 workstation" at Tim and Phil... and see their red faces brighten up like Rudolph the red-nosed reindeer's.
I'm not going to watch a 2 hour keynote only to be disappointed. In other words, I'm not going to watch it! I'll check in after it's over.
Hopefully that rumor comes through.

Actually, the most valuable part of the keynote is the part about the new APIs and Swift 3; that's really worth it.
And it's wise to be ready to be disappointed by Apple again... Aummmmmmm Aummmmm Aummmmmmmmmmmmmmmm
Remember that CrossFire is a DirectX 11 method of driving graphics with dual GPUs. OpenCL is perfectly capable of taking advantage of dual GPUs in compute workloads.
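To make that concrete, here's a minimal sketch (plain C against the OpenCL 1.x host API, error handling mostly omitted and the kernel/build steps elided) of how a compute app sees both cards without any CrossFire: enumerate the GPU devices, create one context over them, and give each card its own command queue so the work can be split between them.

#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif
#include <stdio.h>

int main(void) {
    /* First available OpenCL platform (OS X exposes a single Apple platform). */
    cl_platform_id platform;
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return 1;

    /* Ask for up to two GPU devices, e.g. the two FirePros in a nMP. */
    cl_device_id gpu[2];
    cl_uint ngpu = 0;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 2, gpu, &ngpu);
    printf("GPU devices visible to OpenCL: %u\n", (unsigned)ngpu);
    if (ngpu == 0) return 1;

    /* One context spanning every GPU found, and one command queue per GPU. */
    cl_context ctx = clCreateContext(NULL, ngpu, gpu, NULL, NULL, NULL);
    cl_command_queue q[2];
    for (cl_uint i = 0; i < ngpu; ++i)
        q[i] = clCreateCommandQueue(ctx, gpu[i], 0, NULL);

    /* From here an app would build its kernels once against the context,
       then enqueue roughly half of the work on q[0] and half on q[1] --
       no CrossFire involved at any point. */

    for (cl_uint i = 0; i < ngpu; ++i) clReleaseCommandQueue(q[i]);
    clReleaseContext(ctx);
    return 0;
}

Whether any given pro app actually bothers to do this is exactly the question above, of course; the point is just that nothing at the API level requires CrossFire to use both D700s (or D500s).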
Not if both cards exceed the power of the single one. The D700s were more powerful than any single card at the time, and the dual D700s were even faster than a single Titan X released two years later. The D300 and D500 were for different price points and needs.
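(For rough numbers, and assuming the workload actually scales across both cards: Apple rated each D700 at about 3.5 TFLOPS single precision, so roughly 7 TFLOPS for the pair, versus roughly 6.1-6.6 TFLOPS single precision for a Titan X, going by the published specs.)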
Except those who want high-performance compute instead of gaming. The former is more aligned with the Pro than the latter.
This.

Oh FFS.
I'd happily pay more money for a GPU aimed at gaming instead of high performance compute.
I'm not even going to contest it. As an engineer who gets to use neat hardware and software to make pretty cool stuff, I want to buy a Mac Pro to do some of that... and play games on.