GPUs are good at code without inter-dependent state and complex conditional flow. CPUs are better when control flow is complex and parallelism is limited.
Think of a GPU as a collection of tiny, underpowered CPUs.
A lot of that information looks pretty old - for example, GPUs have had support for conditional flow control for over a decade at this point. It might still have a performance penalty if there is a lot of divergence between the threads in a warp (or the AMD equivalent), but it's not enough to make a GPU a bad target for a massively parallel program. The sheer number of execution cores in a GPU means it'll always be a better fit for tasks with a lot of parallelism.
I think that I said exactly that, but from the opposite viewpoint - if you don't have massive parallelism, a GPU might not be a good fit. If you have data dependencies, you often have limited parallelism. If you have complex control flows, GPUs might technically be able to be used - but CPUs are faster.
And "having conditional flow control" and "having efficient conditional flow control" are two different things. A GPU might be good for streams that are mostly sequential with a small number of branches, but fail to scale for streams with lots of conditionals and dependencies.
I have one rack of systems with 72C/144T and quad Titan X (Maxwell). Most of the jobs we run on those systems have phases where GPU would be a negative and all the CPU cores that you can throw at it are used. Other phases of a job are 100% GPU and the CPUs are mostly idle. (I love when "htop" shows 144 bar graphs for core activity, and all are at 100%.)
To get these jobs done quickly, huge numbers of both CPU cores and CUDA cores are the trick. (I have GTX 1080Ti in shipment now to upgrade the Titans.)
There is no "one size fits all" in GPU work. The problem space is much larger than any one of us is familiar with.
GPUs offer a staggering amount of processing power. CPU performance is still measured in hundreds of GFLOPS, while GPUs have passed the 10 TFLOPS mark, so you're looking at something in the ballpark of 10-30x more raw processing power on the GPU side. For heavily parallel workloads, CPUs simply can't compete.
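Back-of-envelope numbers behind that gap, using hypothetical parts and peak theoretical rates only (not measured performance):

CPU: 8 cores x (8-wide FP32 FMA x 2 units x 2 FLOPs) x 3.5 GHz ≈ 0.9 TFLOPS
GPU: 3584 shaders x 2 FLOPs per clock (FMA) x ~1.58 GHz ≈ 11.3 TFLOPS

That lands in the 10-30x ballpark, at least on paper; real workloads rarely hit peak on either side.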
Just doing my journalistic duty...
Supposedly leaked Vega 10 Time Spy score. Look at the graphics score.
GTX 1080 Ti.
Take it with a grain of salt, but these are supposedly leaks. The previous leak that Manuel linked was supposedly about the small Vega's performance.
And you still have to bear in mind that there are features of the Vega architecture that are not being used yet, and they are key points for performance: FP16 and Primitive Shaders. They require rewriting the application to use them (that is why SiSoft Sandra is reporting a 1:1 FP16 ratio, rather than 2x vs. FP32).
Let's wait and see what happens soon. P.S. It appears that this is the exact reason why Nvidia is rushing the Volta release.
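On the FP16 point above: packed FP16 only helps code that is written to use the packed types, which is consistent with an unmodified benchmark reporting a 1:1 rate. A minimal sketch in CUDA terms (the kernel is hypothetical; Vega's packed-FP16 path is the analogous feature on the AMD side):

```cuda
#include <cuda_fp16.h>

// A plain float kernel gets no packed-FP16 benefit; the data and the math have
// to be expressed in the two-wide half2 type so each instruction carries two
// FP16 values. That rewrite is what "requires rewriting the application" means.
__global__ void axpy_half2(int n, __half2 a, const __half2 *x, __half2 *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                          // n counts half2 pairs (two FP16 values each)
        y[i] = __hfma2(a, x[i], y[i]);  // one fused multiply-add over two values at once
}
```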
13 TFLOPs. That's promising. With a core clock of 1600 MHz that puts it above 12 TFLOPS, right?
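For what it's worth, the usual peak-rate arithmetic gets you to that figure, assuming the rumored 4096 stream processors for Vega 10 (that shader count is an assumption, not a confirmed spec):

4096 shaders x 2 FLOPs per clock (FMA) x 1.6 GHz ≈ 13.1 TFLOPS FP32

and roughly double that in FP16, for code that actually uses the packed path.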
Volta was slated Q4 2017 for HPC, and Q1-Q2 2018 for consumer, but might come faster than that.
Source? You can't "rush" a GPU release by 6-12 months, i.e. if Volta is coming in Q3 2017 then they've been working towards that date for a long time now.
And besides that, this just shows that Vega is roughly equal to the GTX 1080 Ti. If that's the case it's still somewhat of a win for Nvidia, because GP102 is most likely cheaper to manufacture than Vega 10 due to its smaller die size and more conventional memory. Not to mention it's been out for 9 months.
No. Previous benchmarks show that Vega with 687F:C1 is faster than the GTX 1080 in games (Doom Vulkan, Star Wars Battlefront). According to the Time Spy benchmark that Manuel posted, it is on par with the GTX 1070. This GPU is a different deviceID - 687F:C3, to be precise.
I will not go deeper on this topic.
Did NVIDIA announce those dates? Or are you just quoting internet rumors?
As always... That's a cop out and you know it.
So far I have not seen any release date for either Vega or Volta.
Never believe internet rumors for release dates. Remember when Vega was going to be released fall 2016?
That doesn't seem to stop you from saying that you know the release date...
Have you?
Release date "Q4" is different from release date "12.01.2016".
Then maybe you shouldn't trust rumors and quote them like they are fact.
Every company is going with release "spans". And they can still shift. Because of reasons.
Right, and Intel is also not coming up sooner because of Ryzen/Naples.
We're all believers.
There might be no actual "evidence" of the rush, but do we really believe that if any of the involved parties could milk a design a bit further, they would just launch something else ahead of time?
I'm counting on the usual suspects to refute this with their infinite wisdom, of course - we know how it goes. Maybe it's to keep OEMs happy, since rebranding wouldn't cut it? Or maybe some other excuse? Sorry, reason.
Don't really want to pick a(nother) fight here with the usual people, but come on - there's always a reason behind moving a schedule, in either direction.
I won't discuss this any more; I foresee another long exchange of nasty posts.