
libertyranger10

macrumors regular
Original poster
Jun 10, 2011
Hello, I was hoping that someone with the new iMac Pro (8-core preferably, but any model will do) would be able to download the demo version of Neat Image and run the performance test.

It's in the performance tab of the preferences.

I've uploaded some screenshots of it. You have the option of running the performance test with the CPU and GPU. I use this program quite frequently and was curious as to how it would perform vs. my 2014 MacBook Pro. Thanks!
 

Attachments

  • Screen Shot 2018-01-29 at 3.05.58 PM.png
  • Screen Shot 2018-01-29 at 3.05.46 PM.png
This is cool! I didn't even know Neat Image existed. I only knew of Neat Video, which is a great plugin for FCPX and other NLEs.

I don't have an iMac Pro, but I have a 2014 5K iMac that I'll post results from to give you some context. First off, it doesn't look like hyperthreaded cores help much. And second, something seems off in your pictures, since they only show 31 MB of available GPU memory.

Screen Shot 2018-01-29 at 9.34.04 PM.png
 
Probably depends on whether Neat Image can use half-precision floats...
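To spell out the half-precision angle: FP16 halves the memory footprint (and therefore the memory traffic) compared with FP32, at a cost in precision, which is where a GPU with fast FP16 support could pick up extra denoising throughput if the filter's math tolerates it. A minimal numpy sketch of the trade-off (purely illustrative, nothing to do with Neat Image's actual internals):

[CODE]
import numpy as np

# A stand-in "noisy tile" like a denoiser might process: 4096 x 4096 single-channel pixels.
tile_fp32 = np.random.rand(4096, 4096).astype(np.float32)
tile_fp16 = tile_fp32.astype(np.float16)

# Half precision halves the memory footprint (and the bandwidth needed to move it).
print(tile_fp32.nbytes / 2**20, "MiB in FP32")   # 64.0 MiB
print(tile_fp16.nbytes / 2**20, "MiB in FP16")   # 32.0 MiB

# The cost is precision: machine epsilon is ~1e-7 for FP32 vs ~1e-3 for FP16.
print(np.finfo(np.float32).eps)   # ~1.19e-07
print(np.finfo(np.float16).eps)   # ~9.77e-04
[/CODE]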

Here's mine. Not an iMac Pro, not by a long shot.

Screen Shot 155.png
 
Thanks for the heads up! Just redid it and it shows more memory now. Looks like your desktop GPU is much more powerful than the laptop one I have.
 

Attachments

  • Screen Shot 2018-01-29 at 3.51.37 PM.png
Hello, I was hoping that someone with the new iMac Pro (8-core preferably, but any model will do) would be able to download the demo version of Neat Image and run the performance test.

It's in the performance tab of the preferences.

Here you go - run on 8-core iMP, with 32GB and Vega 56.
 

Attachments

  • Screen Shot 2018-01-29 at 2.03.16 PM.png
  • Screen Shot 2018-01-29 at 2.03.28 PM.png
It seems this software runs much better on Nvidia GPUs. This result is from my Mac Pro (specs as per my signature).
Screen Shot 2018-01-30 at 14.36.43.jpg
Screen Shot 2018-01-30 at 14.32.26.jpg
 
Thanks for the heads up! Just redid it and it shows more memory now. Looks like your desktop GPU is much more powerful than the laptop one I have.

Technically the GPUs in the iMacs that this post refers to, i.e. the R9 M290X and M295X, are both mobile GPUs. They are, however, insanely high-TDP mobile chips that could never actually go into a mobile product, aside from a desktop replacement like the ones Clevo make.

Here you go - run on 8-core iMP, with 32GB and Vega 56.

Ya - I thought that was odd! I ran it again with the same result, though, so it must be an implementation detail of how Neat Image works in this particular configuration.

I would assume the CPU slowing things down is a result of how much faster the GPU is at the relevant tasks. If the GPU and CPU have to share data to perform the operations, it could simply be that the time it takes to move results between them is longer than the time it takes the GPU to just work on them on its own.
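A back-of-the-envelope way to see how that can happen. The figures below are invented for illustration (not measured from Neat Image or any benchmark in this thread), but they show how a per-tile synchronisation cost can make a CPU+GPU split slower than letting the GPU run alone:

[CODE]
# Toy model: why adding the CPU to a fast GPU can make things slower overall.
# All numbers are made up for illustration only.

image_mp = 100.0        # megapixels to denoise
gpu_rate = 50.0         # MPix/s the GPU can sustain on its own
cpu_rate = 10.0         # MPix/s the CPU can sustain on its own
sync_overhead_s = 0.02  # per-tile cost of shuttling data/results between CPU and GPU
tiles = 200             # the image is split into tiles that must be exchanged and merged

gpu_only = image_mp / gpu_rate

# Ideal split: each processor takes work in proportion to its speed...
combined_compute = image_mp / (gpu_rate + cpu_rate)
# ...but every tile handed back and forth pays the synchronisation tax.
combined = combined_compute + tiles * sync_overhead_s

print(f"GPU only:  {gpu_only:.2f} s")    # 2.00 s
print(f"CPU + GPU: {combined:.2f} s")    # 5.67 s -- slower despite using more hardware
[/CODE]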
 
Technically the GPUs in the iMacs that this post refers to, i.e. the R9 M290X and M295X, are both mobile GPUs. They are, however, insanely high-TDP mobile chips that could never actually go into a mobile product, aside from a desktop replacement like the ones Clevo make.

I would assume the CPU slowing things down is a result of how much faster the GPU is at the relevant tasks. If the GPU and CPU have to share data to perform the operations, it could simply be that the time it takes to move results between them is longer than the time it takes the GPU to just work on them on its own.

Looks like that's why the 1080 Ti results in the Mac Pro listed above are so much higher than the CPU+GPU run.
 
Looks like that's why the 1080 Ti results in the Mac Pro listed above are so much higher than the CPU+GPU run.

Nvidia GPUs and AMD GPUs don't work the same way. The way Neat Image is written could simply perform better on Nvidia's architecture.

The 1080 is substantially faster than an R9 M295X, but there are many things to consider in cases like this. The fact that the 1080 scores more than 10 MP higher than the Vega does point to Nvidia cards being favoured, though.
 
So that would explain why the 1080 did 50% better on its own than the Vega? Or are the 1080 Tis just that much better? Or perhaps a little of both?
 
So that would explain why the 1080 did 50% better on its own than the Vega? Or are the 1080 Tis just that much better? Or perhaps a little of both?

Different tasks make different cards shine. This is why cryptocurrency miners always seem to favour AMD: whilst some Nvidia cards do well at mining too, it's generally AMD cards that do better.

In general, the Vega 56 is faster than the 1070 but not as fast as the 1080. The Vega 64 should be more on a par with the 1080; however, I doubt it will be in the case of Neat Image, since the difference really shouldn't be anywhere near this big if raw performance alone, rather than the specific workload, were what mattered.
 
Different tasks make different cards shine. This is why cryptocurrency miners always seem to favour AMD: whilst some Nvidia cards do well at mining too, it's generally AMD cards that do better.

In general, the Vega 56 is faster than the 1070 but not as fast as the 1080. The Vega 64 should be more on a par with the 1080; however, I doubt it will be in the case of Neat Image, since the difference really shouldn't be anywhere near this big if raw performance alone, rather than the specific workload, were what mattered.

Awesome! Thank you for your insight! Going from my 2014 MacBook Pro to the iMac, it looks like I'll be getting around a 4 to 5 times speed boost on these numbers!
 
Awesome! Thank you for your insight! Going from my 2014 MacBook Pro to the iMac, it looks like I'll be getting around a 4 to 5 times speed boost on these numbers!

If these benchmarks really reflect the software's real-world performance, then it must be highly optimised for Nvidia GPUs. So even if you get an iMac, you may still want to consider whether an Nvidia eGPU is a viable option to speed up your workflow.

If it's possible to set up an Nvidia eGPU correctly, an iMac with a powerful Nvidia eGPU should be more cost-effective than an iMac Pro (in this particular case).
 
Also fair to point out that that isn't just a 1080 but a 1080 Ti, which is more powerful than the 1080 itself (by a non-trivial amount) and one of the most powerful consumer cards you can buy.
 
If these benchmarks really reflect the software's real-world performance, then it must be highly optimised for Nvidia GPUs. So even if you get an iMac, you may still want to consider whether an Nvidia eGPU is a viable option to speed up your workflow.

If it's possible to set up an Nvidia eGPU correctly, an iMac with a powerful Nvidia eGPU should be more cost-effective than an iMac Pro (in this particular case).

That's great advice, thank you! I'm hoping Apple releases 6-core iMacs this year, considering Intel now has them with the i7-8700K. Will be interested to see how those stack up against the new iMac Pros. An external GPU would be a good fit!
 