Jul 4, 2015
4,487
2,551
Paris
@ SoyCapitan: I have a mid-2010 12-core cheesegrater. It has an MVC GTX980Ti and Nvidia web drivers, a boot SSD and two HDs. I run all sort of processor and GPU-intensive apps for CG and this machine is fast and stable running under High Sierra.

I too am looking forward to seeing what the new Mac Pro will do/be, but hoping (probably unwisely) for Nvidia support out of the box.

Thanks, I was the main Nvidia-in-cMP benchmarker on this board 3-4 years ago. But the cMP has outlived its usefulness, and having to maintain it with unorthodox methods is a pain. The CPUs are too old and too lacking in features for any modern use, even with 12 cores, which are rarely utilised efficiently.

The biggest shock came when I posted Premiere benchmarks 3 years ago: on the SAME cMP, with the same video, the same project, and the same version of Premiere, macOS was four times slower at rendering than Windows.

So not only does Nvidia lack 10-bit output, which is essential for most creative workstation uses; the OS and the apps also only work at their best if you have the latest CPUs.
 

Fangio

macrumors 6502
Jan 25, 2011
375
473
10717
Well, I agree with the first comment on YT ;)

I'd like to see the cable from the display attached to the actual GPU. I'd also like to see someone hold down the Option key to get to the Boot Manager. And there's no shot of About This Mac; that display could be attached to anything. Real proof, please.
 

abdyfranco

macrumors regular
Dec 4, 2017
127
121
Well, I agree with the first comment on YT ;)

Especially since, when people asked to see more, the poster of that video told them to email him. As if it's too precious/secret/important to share with the rest of the world....
Yes, the secrecy they treat it with and the lack of further evidence are suspicious... until someone here on the forum acquires an RTX 2080 and confirms that it works, it's fake as far as I'm concerned.
 

Dr. Stealth

macrumors 6502a
Sep 14, 2004
814
740
SoCal-Surf City USA
Yes, the cMP is on its way out, and I do CADD design and a lot of CUDA rendering. My trusty mid-2010 5,1 cMP was getting too long in the tooth for my needs, even though it was maxed out. It was time to let my old friend go out to pasture. But I'm not ready to stop using macOS quite yet.

So this was my solution:

I built a bad-ass PC workstation for my daily CADD and CUDA rendering work. It took the place of my cMP on my desk =(. Then I brought a pretty much unused Late-2014 i7 Mac mini back into service and placed it right next to the PC workstation. I picked up a very nice ATEN DisplayPort KVM switch so I can use both computers with the same 32" 4K Dell display, keyboard, mouse, and various USB devices.

This actually worked out much better than I could have ever imagined. I jump on my PC, load a bunch of renders into my queue, and hit the start button. This totally maxes out the CPUs and GPUs at 98-100% utilization.

I can hit one hot-key on my keyboard and toggle between the two machines at will. So once I load up my PC with renders, I switch to the mini and do my light daily work, like web browsing, emails, and generic stuff in macOS, with no load.

It doesn't have to be this computer VS that computer. We have wonderful technology available to let you have it all.

Anyway... Just my 2 cents...
 
Last edited:

eksu

macrumors 6502
Aug 3, 2017
329
151
That's the point, and you answered it with your second post. If you need a program with CUDA, you'd be better off just buying a new computer, any of which runs circles around Mac Pros.


The Classic Mac Pro with an RTX card would be perfect to work with Swift for Tensorflow (those tensor cores).

I just hope it becomes more stable to use in Playgrounds, and that Nvidia can get HEVC decode, encode, and 10-bit output working (basically Vega feature parity on macOS).
 

teagls

macrumors regular
May 16, 2013
202
101
HIP and ROCm surely will change that.

Doubtful. CUDA is too far entrenched, especially in deep learning. No data scientist, academic, or ML engineer wants to fool around with translating code over to work on AMD. There is only so much time in the day, and when there is pressure to push deadlines and deliver, the software needs to work. NVIDIA and CUDA provide that.
 