
shuto

macrumors regular
Oct 5, 2016
195
110
For my money Arnold is the best renderer out there for features, stability and C4D integration, but you do take a speed hit for its superior feature set. That might change as its GPU support matures, though.

Unfortunately GPU Arnold is going to be PC only RTX nvidia though :(

It’s still really hard to work out if the Mac is a viable platform for 3D GPU rendering work until benchmarks are out. Can’t wait for benchmarks!!!

I think I’ve decided my setup will be PC + 16” MacBook Pro though.

Still would love it if Mac Pro is amazing for GPU work though!
 

AndreeOnline

macrumors 6502a
Aug 15, 2014
704
495
Zürich
Still would love it if Mac Pro is amazing for GPU work though!

I'm mostly at home in Cinema 4D and Arnold/Maxwell. But surely there is someone in the Blender community with some programming chops who is looking at the Mac Pro with dual Pro Vega II Duo cards and imagines that working with Blender...

Blender is making huge strides every day and while it's most likely more of a mid/long term perspective, I'm thinking the right move might actually be to ease into Blender. There's plenty of learning material.
 

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
Unfortunately GPU Arnold is going to be PC only RTX nvidia though :(

It’s still really hard to work out if the Mac is a viable platform for 3D GPU rendering work until benchmarks are out. Can’t wait for benchmarks!!!

I think I’ve decided my setup will be PC + 16” MacBook Pro though.

Still would love it if Mac Pro is amazing for GPU work though!

Oof, I didn't realise that about Arnold. That is extremely unfortunate. I used Arnold for a job today and really enjoyed its features and look.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Unfortunately GPU Arnold is going to be PC only RTX nvidia though :(


You may want to inform the company of that. The release notes say it isn't RTX-only. Nor is it Windows PC-only.


From the "list of supported GPUs" there the last 3 generations of Nvidia cards are there. So is Linux.
They "recommend" RTX , but that isn't an 'only' constraint. [ I also recall there being some issues/problems if mix RTX and older cards for multiple GPU. ].

They are more than eyeball deep in Nvidia's OptiX library though.

You can stuff 10GB into the OptiX cache, so that will pull more work off of CPUs over time. Apple has lots of work to do to enable competitive alternative solutions to appear.
 

hifimac

macrumors member
Mar 28, 2013
64
40
The Redshift Trello says “Assisting Apple with bringing Redshift to Metal” so looks like Apple is pretty involved in this port. Hopefully that means some good performance.
 

hifimac

macrumors member
Mar 28, 2013
64
40
Looks like Octane is now saying they are waiting on Metal support coming in 10.15.5, which is a ways off from release.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Looks like Octane is now saying they are waiting on Metal support coming in 10.15.5, which is a ways off from release.

Bit of a bummer but it’s always good to see Apple putting in a lot of improvements for Metal. It’s nice Octane is helping them toughen it up.
 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
Unfortunately GPU Arnold is going to be PC only RTX nvidia though :(

I use Arnold for Cinema 4D pretty heavily and can say that while its GPU speed is faster than most mid-level Intel chips, the time it takes to send an average scene to the GPU before it can render kills the experience. If you're doing Arnold rendering, you'd be better off going with a high-core-count CPU like a Threadripper / Epyc, or Xeons if you want to burn money. I can say from experience that it's faster and more stable on a good CPU. Of course this could all change with future GPUs and Arnold updates, but even now that it's out of beta, it's nothing to write home about if you do serious production.

edit : Still prefer running C4D and Arnold on a Mac despite the steep cost of Xeons. I've been using it on a Windows system more powerful than the top-end MP, and the Windows implementation of OpenGL coupled with nVidia's drivers (over the last few years) has made C4D's viewport faster but less reliable than what's on the Mac, particularly with crashes in the IPR with heavy scenes.
 
Last edited:
  • Like
Reactions: sirio76

VJNeumann

macrumors member
Jul 26, 2017
46
73
edit : Still prefer running C4D and Arnold on a Mac despite the steep cost of Xeons. I've been using it on a Windows system more powerful than the top-end MP, and the Windows implementation of OpenGL coupled with nVidia's drivers (over the last few years) has made C4D's viewport faster but less reliable than what's on the Mac, particularly with crashes in the IPR with heavy scenes.

I know there's no one metric that can be used to compare the two, but how much slower would you say the average render time is for you, shifting to Mac (I'm assuming MP 7,1?) from a Windows/nVidia machine?

Personally I enjoy dabbling in 3D, but working out which configuration I should focus on is like shooting multiple moving targets from a moving platform. On the one hand I'm deciding whether to stick with Maya or C4D, then there's the renderer question (Arnold, or the upcoming OSX Redshift/Octane), and then the perennial question of whether to get it working on a Mac or just go with the tried-and-true glorified gaming PC route (Threadripper, RTX).

It's so hard to compare with all those variables in play. Like, to what extent will a GPU renderer be wounded by Mac's hardware selection, etc...
 

lilkwarrior

macrumors 6502
Jul 9, 2017
402
263
San Francsico, CA
Personally waiting for W5700X to become available and/or pricing to be released before I further analyze those decisions. One thing is clear - I'm not interested in purchasing another RX580-based GPU in 2020.

The Sapphire Pulse version of RX 5700 XT is a very quiet GPU in all testing so far, even in an eGPU enclosure. Sapphire Pulse RX 580 is/was quiet in MP5,1 a lot of the time but did kick up when being used. Also has a BIOS switch for true silent mode.
I'm in the same boat as you, but I've since had my eyes on Big Navi w/ ray-tracing confirmed for 2020, no MPX module for 'em be damned.

I'm slightly concerned about how much louder a non-MPX machine would be, considering how loud my 2 x RTX 2080 Ti / Quadro rig gets using the direct-from-Nvidia model.
 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
Personally I enjoy dabbling in 3D, but working out which configuration I should focus on is like shooting multiple moving targets from a moving platform. On the one hand I'm deciding whether to stick with Maya or C4D, then there's the renderer question (Arnold, or the upcoming OSX Redshift/Octane), and then the perennial question of whether to get it working on a Mac or just go with the tried-and-true glorified gaming PC route (Threadripper, RTX).

I use a render farm running Arnold as one of its options for my job, so using a Mac as my workstation is a better option despite the Xeon's price/performance compared to Threadripper. Redshift is a good renderer, but I prefer the look of Arnold. It's been way more stable for me across the board when compared to Redshift.

If I were doing it at home just for fun, I'd still go Mac Pro. Windows + nVidia has been nothing but an endless frustration in drivers and hiccups over the course of 3 separate Windows system builds. I find it hard to be professionally creative when I'm constantly fighting my computer.

If I were doing paid jobs at home, I'd go Mac Pro for workstation and build one or more threadripper systems to use as render boxes for Cinema 4Ds Team Render.

edit : Average render time for a 4K UHD frame in Arnold is about an hour, but they're pretty dense scenes. Redshift is definitely faster but is less stable and takes more work to get looking "photoreal", if that's your goal.
 
  • Like
Reactions: vel0city

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Looks like Octane is now saying they are waiting on Metal support coming in 10.15.5, which is a ways off from release.

At current pace that should put it near May-June. Which will be close to a year since they and Apple said they were going to do something here.
Bit of a bummer but it’s always good to see Apple putting in a lot of improvements for Metal. It’s nice Octane is helping them toughen it up.

Looks more like Metal really wasn't ready to be a replacement for OpenCL at all, and pretty close to another "dog ate my homework" event from Apple in June. It is better than them being in Rip van Winkle mode, but a year waiting on system library functionality says it wasn't ready for prime time in the first place.
 

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
I agree with everything @awkward_eagle posted about Arnold, there is no "best" renderer, but Arnold is the best renderer. For lookdev/styleframes or stills I tend to go Redshift, but for anything big I know that Arnold is dependable, scalable and will look amazing. It's a mature and stable renderer and out of all the third party renderers, I find it has the closest integration with C4D and its plugins.

I don't have a renderfarm though so all my final renders get farmed to PixelPlow. Great workflow that I really enjoy.
 
  • Like
Reactions: hifimac

hifimac

macrumors member
Mar 28, 2013
64
40
Thanks for all your input and it's good to hear from some seasoned 3D veterans who are not just PC enthusiasts. I've got a big multi-month 3D gig coming in a few weeks and have been stressing about if I should jump ship to a PC. I'm a longtime Mac user and would love to stay on the platform, but the new Mac Pro pricing is really hard to swallow, especially without Metal GPU benchmarks. I've been testing Octane, Redshift and Corona with an old 980ti on my 5,1. Octane has been completely unstable. Redshift is fast, but has been hard to dial in. I've gotten some good looks out of Corona, but it's been pretty buggy and rather slow, and v5 does not work with R21 on macOS. I guess I need to take another look at Arnold. Good to hear I'm not really missing out on anything with its GPU implementation. Any recommended training for Arnold beyond GSG?
 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
Thanks for all your input and it's good to hear from some seasoned 3D veterans who are not just PC enthusiasts. I've got a big multi-month 3D gig coming in a few weeks and have been stressing about if I should jump ship to a PC. I'm a longtime Mac user and would love to stay on the platform, but the new Mac Pro pricing is really hard to swallow, especially without Metal GPU benchmarks. I've been testing Octane, Redshift and Corona with an old 980ti on my 5,1. Octane has been completely unstable. Redshift is fast, but has been hard to dial in. I've gotten some good looks out of Corona, but it's been pretty buggy and rather slow, and v5 does not work with R21 on macOS. I guess I need to take another look at Arnold. Good to hear I'm not really missing out on anything with its GPU implementation. Any recommended training for Arnold beyond GSG?

GSG has great training. Also do a search on lesterbanks.com to find a lot more stuff that gets into specifics like rendering skin and caustics. Anything you find for Arnold in Maya translates almost exactly to the C4D version.

Arnold v5 works for me in R21, but I had to move to a newer build to get it working. v6 also works. As vel0city said, Arnold scales really well on account of being originally created as a production renderer for large facilities, as opposed to Redshift and Octane, which are more single-user hobbyist tools. Those are great renderers, but they can easily run into instability or other limitations depending on how complex your project is. Arnold works great on commercial render farms like PixelPlow and even C4D's Team Render.
 

VJNeumann

macrumors member
Jul 26, 2017
46
73
I use a render farm running Arnold as one of its options for my job, so using a Mac as my workstation is a better option despite the Xeon's price/performance compared to Threadripper. Redshift is a good renderer, but I prefer the look of Arnold. It's been way more stable for me across the board when compared to Redshift.

If I were doing it at home just for fun, I'd still go Mac Pro. Windows + nVidia has been nothing but an endless frustration in drivers and hiccups over the course of 3 separate Windows system builds. I find it hard to be professionally creative when I'm constantly fighting my computer.

If I were doing paid jobs at home, I'd go Mac Pro for workstation and build one or more threadripper systems to use as render boxes for Cinema 4Ds Team Render.

Thanks, some good insights here. Do you think upcoming Navi with RTX will shake things up at all? It seems to me GPU-heavy rendering (at least in workstations, not render farms) is where the industry is heading, so I guess I'm wondering if I should see what AMD has in store this year.

I agree with everything @awkward_eagle posted about Arnold, there is no "best" renderer, but Arnold is the best renderer.

In the past that's been my experience. Does anyone suggest using Maya these days on OSX, especially with Arnold? Or is there just no upside versus C4D? I used to prefer Maya's toolset, but found its OSX versions buggy and unstable. May be different nowadays though, it's been a while since I tried it.
 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
Do you think upcoming Navi with RTX will shake things up at all?

No one knows how good Navi will be, but the introduction of RTX on nVidia hasn't exactly taken the world of post production by storm. RED uses RTX cores for .r3d decode so you don't need to buy their Rocket card (RED's version of the Afterburner), and the GPU implementation of Arnold uses Nvidia's OptiX shaders, but like I mentioned above, it's not as good as you'd think when comparing it to high-core-count Xeons or Threadrippers. My guess is it'll be useful when Apple implements raytracing into Metal (they've given demos) using dedicated cores on the GPU. Part of the reason I prefer the Mac for production is that when Apple implements new tech into the system, it usually has a more holistic effect than when random developers have to do a lot of leg work on the Windows / Nvidia side, making everything really inconsistent. Since Apple is dumping OpenCL and making everyone use Metal, every update to Metal and every new piece of hardware that utilizes it sees an overall increase in performance without much work on the app developer side.

Does anyone suggest using Maya these days on OSX, especially with Arnold? Or is there just no upside versus C4D? I

Maya on Windows is pretty good these days, but the macOS port still isn't quite there. This is generally true of all Autodesk apps. Maxon has always put a lot of focus on making Cinema 4D run well on a Mac, and that's especially true with the release of R21. Between that and Arnold running great on both platforms, the C4D + Arnold combo is my personal preference. I generally like the UI and overall C4D architecture over Maya, even on Windows.
 

VJNeumann

macrumors member
Jul 26, 2017
46
73
No one knows how good Navi will be, but the introduction of RTX on nVidia hasn't exactly taken the world of post production by storm. RED uses RTX cores for .r3d decode so you don't need to buy their Rocket card ( RED's version of the Afterburner ) and the GPU implementation of Arnold uses nvidia's optix shaders, but like I mentioned above, it's not as good as you'd think when comparing it to a high core count Xeons or Threadrippers.

Very true. And I'd be curious to see benchmarks of, say, Vega II Duo versus dual 2080ti on something like Redshift or another GPU renderer. I'm not aware of any, and I'm always curious how well workstation-class cards actually stack up against consumer cards these days.

Also, I've heard it said that the Afterburner card can theoretically be leveraged by any third party as an accelerator for different workflows than ProRes. I wonder if any companies will ever take advantage of that, or if it will remain too niche a product.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
No one knows how good Navi will be, but the introduction of RTX on nVidia hasn't exactly taken the world of post production by storm. RED uses RTX cores for .r3d decode so you don't need to buy their Rocket card ( RED's version of the Afterburner ) and the GPU implementation of Arnold uses nvidia's optix shaders, but like I mentioned above, it's not as good as you'd think when comparing it to a high core count Xeons or Threadrippers. My guess is it'll be useful when Apple implements raytracing into Metal ( they've given demos )

Raytracing versus debayering and/or decompressing camera sensor data: both are computationally intensive, but they're different at the same time. The former pulls data from many different places in the "frame". The latter is far more restricted to the immediately surrounding "frame" data elements. There may be some computational loops you can stuff into a tensor matrix-multiply context that are shared in common, but there will be as many differing elements as shared ones.
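That access-pattern difference can be sketched with a toy example (illustrative Python only, not RED's or anyone's actual pipeline): a debayer-like pass reads only each pixel's immediate neighbourhood, while a raytrace-like pass gathers from arbitrary, scattered locations in the frame.

```python
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((64, 64))  # stand-in for one image "frame"

def local_pass(img):
    # Debayer/decompress-style work: each output pixel reads only its
    # 3x3 neighbourhood -- coherent, cache-friendly memory access.
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = img[y - 1:y + 2, x - 1:x + 2].mean()
    return out

def scattered_pass(img, n_samples=8):
    # Raytrace-style work: each output pixel gathers from arbitrary
    # places in the frame -- incoherent, scattered memory access.
    h, w = img.shape
    ys = rng.integers(0, h, size=(h, w, n_samples))
    xs = rng.integers(0, w, size=(h, w, n_samples))
    return img[ys, xs].mean(axis=2)

local = local_pass(frame)
scattered = scattered_pass(frame)
```

Both produce a full-resolution result, but the second one's reads jump all over memory, which is exactly why hardware tuned for one workload doesn't automatically serve the other well.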

Apple looks to be expanding Metal just enough to cover certain corner cases, as opposed to building a general-purpose compute platform. Whether RED's task fits inside the same tight border they draw around raytracing remains to be seen. (And if that is being held up by AMD beta hardware, it's even more "thin ice", because there's no guarantee that AMD's RT support will be as narrowly specialized as Nvidia's, or that its tensor specialization focus will be similar either. AMD may have gone with more general compute function units with some 'better' local data-passing connections.) Also, Metal's raytracing is likely skewed to whatever Apple is going to do with their own GPU (and tensor core) implementations. The chance that both AMD's and Apple's GPUs basically clone what Nvidia did is slim.


RED's CUDA code for .r3d decode works on non RTX GPU cards. Nvidia likes trumpeting about the more expensive RTX GPU results on maximal 8K footage because that puts more money in their pockets.
But in 4-6K and lower range there are more options than RTX.
 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
RED's CUDA code for .r3d decode works on non RTX GPU cards. Nvidia likes trumpeting about the more expensive RTX GPU results on maximal 8K footage because that puts more money in their pockets.
But in 4-6K and lower range there are more options than RTX.

It reminds me of when the Titan Voltas were released with dedicated machine learning Tensor cores. They sped up a lot of stuff in DaVinci Resolve that had nothing to do with machine learning, simply because the card had more raw compute. Outside of gaming, I see RTX the same way. The same would apply to Navi: raytracing or not, it'll be faster and everything will benefit. But when it comes to price/performance ratio, who knows. There will almost certainly be a cheap "gaming" card that'll work in the Mac Pro, along with an MPX module with double the RAM for a lot more money and better support.
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
Just an FYI, the RED ROCKET FPGA cards have been discontinued. I keep seeing mentions of them on this forum and others when discussing Afterburner and GPU acceleration.

It is still possible to purchase them, but support moving forward is limited at best. At $7K that's not a viable long-term investment.

"GPUs have now outpaced graphics acceleration cards in the technology for processing REDCODE files. RED recommends using an NVIDIA GPU to support your post production needs."

 

awkward_eagle

macrumors member
Feb 5, 2020
84
36
Just an FYI, the RED ROCKET FPGA cards have been discontinued. I keep seeing mentions of them on this forum and others when discussing Afterburner and GPU acceleration.

I think RED officially discontinued the Rocket with the release of RTX support. They've also just released their SDK for Metal support so it's up to Apple and Blackmagic to update their apps.

The comparisons between Afterburner and Rocket, I think, are just because they're both FPGAs that do roughly the same thing for different codecs. It's funny that Apple and RED solved the same problem from opposite directions.

Apple : Do everything on GPU and CPU but eventually release FPGA.

RED : Nothing can play .r3D ( in the early days ) so release FPGA but kill it when GPUs were good enough.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Apple : Do everything on GPU and CPU but eventually release FPGA.

RED : Nothing can play .r3D ( in the early days ) so release FPGA but kill it when GPUs were good enough.

A dedicated FPGA is probably always going to be more efficient. It also leaves the GPU free for other things, like rendering. Considerable GPU performance can be lost when the GPU has to host multiple tasks. That's why Apple has focused so much on dedicated GPUs for dedicated tasks (like the compute/render split in the 2013 Mac Pro.)

I see RED's move as a reflection of their scale. It's probably burdensome for them to produce their own FPGA. It's easier for them to offload that to the GPU, even if it's less efficient.

But not having a FPGA taking the load will hurt rendering performance if you're working directly with RED footage.
 

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
Thanks for all your input and it's good to hear from some seasoned 3D veterans who are not just PC enthusiasts. I've got a big multi-month 3D gig coming in a few weeks and have been stressing about if I should jump ship to a PC. I'm a longtime Mac user and would love to stay on the platform, but the new Mac Pro pricing is really hard to swallow, especially without Metal GPU benchmarks. I've been testing Octane, Redshift and Corona with an old 980ti on my 5,1. Octane has been completely unstable. Redshift is fast, but has been hard to dial in. I've gotten some good looks out of Corona, but it's been pretty buggy and rather slow, and v5 does not work with R21 on macOS. I guess I need to take another look at Arnold. Good to hear I'm not really missing out on anything with its GPU implementation. Any recommended training for Arnold beyond GSG?

It's worth getting a GSG Plus sub, as Chad Ashley has just released a massive Arnold training course, he's a great tutor and goes into deep detail. It doesn't really take that long to learn the basics and core concepts of Arnold. Redshift and Octane will be second nature once you're familiar with Arnold. Feel free to ask on here if you need any pointers or help.

Couple of good links:

https://arnold-rendering.com/ Lee Griggs' blog; he does a lot of weird and cool arty experiments with Arnold and posts links to other Arnold places of interest.

https://answers.arnoldrenderer.com/spaces/13/c4dtoa.html Arnold support Q&A forum

And just to depress you, I saw this today, created in ZBrush and Substance, rendered in Arnold: https://www.zbrushcentral.com/t/checkmate/358529
 

hifimac

macrumors member
Mar 28, 2013
64
40
It's worth getting a GSG Plus sub, as Chad Ashley has just released a massive Arnold training course, he's a great tutor and goes into deep detail. It doesn't really take that long to learn the basics and core concepts of Arnold. Redshift and Octane will be second nature once you're familiar with Arnold. Feel free to ask on here if you need any pointers or help.

Couple of good links:

https://arnold-rendering.com/ Lee Griggs' blog; he does a lot of weird and cool arty experiments with Arnold and posts links to other Arnold places of interest.

https://answers.arnoldrenderer.com/spaces/13/c4dtoa.html Arnold support Q&A forum

And just to depress you, I saw this today, created in ZBrush and Substance, rendered in Arnold: https://www.zbrushcentral.com/t/checkmate/358529

Thanks! The Arnold IPR is not quite as fast as Redshift, but it's good enough on my 10 core iMac Pro, and I have two 5,1's I can throw at some CPU renders when needed. Went ahead and signed up for GSG+ and taking a deep dive into Arnold. Seems really stable and less fidgety than Redshift.
 