
Hey guys,

I run a video production/advertising company, and we currently use two iMacs: one for accounts/clients/pre-production, while the other handles animation/video editing/3D modeling/graphics. The workhorse computer is an i7 3.4GHz with 32GB of RAM, an AMD Radeon HD 6970M with 2GB of VRAM, an SSD for all the applications, and a Pegasus 6-bay RAID system connected over Thunderbolt. We have the computers networked together using a Time Capsule.

Near the end of a project I find myself exporting huge 3D animations mixed with HD video, which soaks up a good 1-3 hours of my day in rendering. We use Adobe applications for pretty much everything. My main question is: what could I do to speed up render times? Any way of buying an old Mac Pro and render farming with it? Let me know what your thoughts are!
 
picture of the office setup
 

Attachments

  • office.jpg
Short answer: yes, it's possible.

Depending on how "big" you want to go, you may need to work out licensing issues, and the hardware gets expensive quickly. Figure out a budget first, then work backwards.

Other forums, like Creative COW, may be better for the specifics (FYI).
 
  • My main question is what could I do to speed up render times?
  • Any way of buying an old Mac Pro and render farming with it?

  • For your current system, you need to profile the components and compare.
    • What kinds of speeds are you getting with that video card?
    • What do other cards on other people's systems profile like?
    • Does your video card accelerate anything to do with the export process?
    • What I/O speeds are you getting from your current RAID?
    • What are the theoretical limits of the RAID system you're using?
    • How much I/O bandwidth is the export process consuming?
    If all of those things check out and you're still not satisfied with the overall performance, then it's time to look at the second question.
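As a rough way to answer the RAID I/O questions above, a sequential-write test can be sketched like this (a minimal sketch; the "/Volumes/Pegasus" path in the comment is an example, so point `target_dir` at your actual RAID mount):

```python
# Crude sequential-write benchmark: writes size_mb of zeros, forces it
# to disk, and reports MB/s. Point target_dir at the RAID volume you
# want to test (e.g. "/Volumes/Pegasus" -- that path is an example).
import os
import time

def write_speed_mb_s(target_dir, size_mb=256):
    path = os.path.join(target_dir, "bench.tmp")
    block = b"\0" * (1024 * 1024)  # 1 MB block
    start = time.time()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # make sure the data hits disk before timing stops
    elapsed = time.time() - start
    os.remove(path)
    return size_mb / elapsed

if __name__ == "__main__":
    print(f"{write_speed_mb_s('/tmp', size_mb=64):.0f} MB/s")
```

Compare the number you get against the theoretical limit of the array; a big gap suggests the RAID isn't the bottleneck in the export.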

  • http://tv.adobe.com/watch/cs6-creat...ful-background-rendering-using-media-encoder/



    And remember that several older machines can be cheaper and faster than (or about the same as) one newer machine.

    For example, an 8-core 3.0GHz MacPro2,1 can be found for $400 to $500, and three of them (after adding RAM to all) will be faster and cheaper than a new-ish $2,000 base machine with the same amount of RAM.
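The back-of-envelope math, using the figures above (the core count of the $2,000 base machine is my assumption for the sake of comparison):

```python
# Used-vs-new comparison using the prices quoted above.
# The new machine's core count (8) is an assumption, not a spec.
used_price = 450   # midpoint of the $400-$500 range per MacPro2,1
used_cores = 8
n_used = 3

new_price = 2000
new_cores = 8      # assumed for comparison

used_total_price = n_used * used_price   # 1350
used_total_cores = n_used * used_cores   # 24

print(f"3 used machines: ${used_total_price}, {used_total_cores} cores")
print(f"1 new machine:   ${new_price}, {new_cores} cores")
```

So roughly three times the cores for about two thirds of the price, at the cost of power draw and desk space.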
 
... My main question is what could I do to speed up render times? Any way of buying an old Mac Pro and render farming with it? ...
Add CUDA compute cores by way of GTX 680s or GTX Titans and increase render performance by at least 5X to 10X per GPGPU [see pgs. 10-14 here: http://www.nvidia.com/docs/IO/123576/nv-applications-catalog-lowres.pdf ].

P.S. Take a look at the site for Octane Render for 3D chores [ http://render.otoy.com ], the sample videos here [ http://render.otoy.com/videos.php ], download the demo(s) from here [ http://render.otoy.com/downloads.php ], and read the manual here [ http://render.otoy.com/downloads/OctaneRenderUserManual.pdf ].
 
Hi everyone,

I have a question related to the first one. I have 2 iMacs, a Time Capsule, and a Mac Pro Server. How can I network all four of them together?
 
Hi everyone,

I have a question related to the first one. I have 2 iMacs, a Time Capsule, and a Mac Pro Server. How can I network all four of them together?

You can either use the Time Capsule to create a secure 5 GHz wireless network and connect the Macs to it (if they all have a WiFi/AirPort card), or use three Cat5e or Cat6 Ethernet cables to physically connect all the Macs to the Time Capsule for a secure local area network (LAN). Either way, enable File Sharing in System Preferences > Sharing on each Mac individually.
Does that answer your question?

To learn more about Mac OS X: Helpful Information for Any Mac User by GGJstudios
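Once the machines are connected, a minimal way to sanity-check that each Mac can reach the others is a quick TCP probe (a sketch; the hostname "imac1.local" is an example, so substitute your own Macs' Bonjour names):

```python
# Minimal LAN reachability check. Port 548 is AFP file sharing;
# the hostname "imac1.local" below is an example, not a real machine.
import socket

def reachable(host, port=548, timeout=2.0):
    """Return True if `host` accepts a TCP connection on `port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, DNS failures
        return False

if __name__ == "__main__":
    # Swap in each Mac's Bonjour name in turn, e.g. "imac1.local".
    print(reachable("localhost", port=22))
```

If a host shows as unreachable, check the cable/WiFi link and that File Sharing is actually enabled on that machine.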
 
Add CUDA compute cores by way of GTX 680s or GTX Titans and increase render performance by at least 5X to 10X per GPGPU ...

GTX 580 OC Edition cards are still better than the GTX 680 with Octane. Any benchmarks with a Titan?
 
You either use the Time Capsule to create a secure wireless 5 GHz network ... Does that answer your question?

Thank you very much, it was very helpful; I'll try it and see how it works!
 

Attachments

  • OctaneBenchmarkTrench.JPG
    OctaneBenchmarkTrench.JPG
    55 KB · Views: 241
Here's one. WolfPackAlphaCanisLupus0 rendered this scene in 14 seconds. You can replicate it from the demo scenes file at the bottom of the page here [ http://render.otoy.com/downloads.php ]. For additional information on this CUDA rig, see posts #630-632 here: [ https://forums.macrumors.com/threads/1333421/ ].

That looks really impressive: that scene is mostly indirect light, and in spite of the low detail, it cleared a significant amount of noise in that time. In case the OP is unfamiliar with it, Octane is an unbiased engine. All of that noise has to be cleared through real sampling as opposed to interpolated samples, and it will render the environment/background color (in this case black) in unsampled or under-sampled areas. I've read that Octane is still quite limited in shader types, but I never got back to investigating that further. Is the scene just geometric boxes with a flat shader and an HDRI? That's what it looks like to me. Even then, such a scene would take a huge amount of time to render, as most of that light is coming from indirect bounces.
 
That looks really impressive ... Is the scene just geometric boxes with a flat shader and an HDRI? That's what it looks like to me.

I've been using the demo for the last 4 months to learn the software, despite having 4 seats. Your assessment of the scene is correct. Since an Octane license gets you access to community shaders, and their numbers are growing constantly, I'd recommend that you give Octane another review.

Here's another render of the same scene after I tweaked the GPUs, quadrupled the sample count, and slightly pulled back the camera, rendering the scene in 13 seconds.
 

Attachments

  • OctaneBenchmarkTrench3.JPG
    OctaneBenchmarkTrench3.JPG
    84.7 KB · Views: 216
I been using the demo for the last 4 months ... I'd recommend that you give Octane another review.

I will eventually check it out again; I'm lacking access to an Nvidia machine at the moment:mad:. Are people writing their own shaders, or just donating materials created from the existing bundled shaders? Some of the gallery pieces on their site look great. It would be cool to see a real scene. It seems incredibly fast, but you're not dealing with any real material complexity, displacement, refraction, or anything else, as it's a test scene. It's just bombarding a bunch of boxes with what looks like irradiance-based GI:D.

I can't totally tell what it's like looking through the gallery either. I watched some of the videos. It seems to handle transparent shadows well, and since it's unbiased, it lacks flicker problems. Some of the bright reflections look a little odd, like there's no anti-aliasing filter. It's hard to really get a good idea of the GI quality without a bit more scene detail. Some of the crevices are falling off nicely without the use of AO. I want to see some of your renders of real scenes. That would be awesome.
 
I been using the demo for the last 4 months...

How do you get it to recognize the CUDA card? I'm using a single GTX 570 with CUDA driver "cudadriver-5.0.59-macos.dmg" on OS X 10.7.5, and I get this:
 

Attachments

  • Screen Shot 2013-05-28 at 8.00.38 PM.png
    Screen Shot 2013-05-28 at 8.00.38 PM.png
    90.1 KB · Views: 155
  • Screen Shot 2013-05-28 at 8.41.12 PM.png
    Screen Shot 2013-05-28 at 8.41.12 PM.png
    81.6 KB · Views: 188
CUDA Driver Troubleshooting Tips

How do you get it to recognize the CUDA card? I'm using a single GTX 570 with CUDA driver "cudadriver-5.0.59-macos.dmg" on OS X 10.7.5, and I get this:

Tesselator,

Here's a pic of what my (Windows, not Mac) preferences window looks like, and some tips that I hope help.


CUDA Driver Troubleshooting Tips For Mac Octane Users


1. Are you using the correct OS?
From Nvidia CUDA site [ http://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#general-cuda-known-issues ]
Mac OS X Supported Mac Operating Systems
A. Mac OS X 10.8.x
B. Mac OS X 10.7.x


2. Have all lib files been properly installed where they need to be installed?
From Nvidia CUDA site [ http://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#general-cuda-known-issues ]
Mac OS X lib Files
lib/
libcudart.dylib - CUDA runtime library
libcuinj.dylib - CUDA internal library for profiling
libcublas.dylib - CUDA BLAS library
libcublas_device.a - CUDA BLAS device library
libcufft.dylib - CUDA FFT library
libcusparse.dylib - CUDA Sparse Matrix library
libcurand.dylib - CUDA Random Number Generation library
libnpp.dylib - NVIDIA Performance Primitives library
libtlshook.dylib - NVIDIA internal library


Tutor's Tip: You may be able to get the best install by changing the ".dmg" extension to ".pkg" and/or using Pacifist [ http://www.charlessoft.com/ ] to perform the install.

3. Are your drivers mixing with older/different driver versions? That is: is this your one and only CUDA install, and do you have the latest drivers installed with no earlier versions in the mix?
From Octane User manual [ http://render.otoy.com/downloads/OctaneRenderUserManual.pdf ]
OctaneRender™ Won't Open Due to "No Cuda Devices" Error Message
Problem: Incorrect Driver installed
Solution: Read the release notes to ensure that you have the correct driver version installed.
Attempt to remove the old driver versions and then install the proper version. It may be necessary to use a tool such as Driver Sweeper to get all driver components uninstalled. This might require individual search and destroy missions.


4. Are your drivers taking a nap?
From Nvidia CUDA site [ http://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#general-cuda-known-issues ]
Mac OS
Device code linking does not support object files that are in Mac OS fat-file format. As a result, the device libraries included in the toolkit (libcudadevrt.a and libcublas_device.a) do not use the fat file format and only contain code for a 64-bit architecture. In contrast, the other libraries in the toolkit on the Mac OS platform do use the fat file format and support both 32-bit and 64-bit architectures.
At the time of this release, there are no Mac OS configurations available that support GPUs that implement the sm_35 architecture. Code that targets this architecture can be built, but cannot be run or tested on a Mac OS platform with the CUDA 5.0 toolkit.
... .
(Mac OS) When CUDA applications are run on 2012 MacBook Pro models, allowing or forcing the system to go to sleep causes a system crash (kernel panic). To prevent the computer from automatically going to sleep, set the Computer Sleep option slider to Never in the Energy Saver pane of the System Preferences.
(Mac OS) To save power, some Apple products automatically power down the CUDA-capable GPU in the system. If the operating system has powered down the CUDA-capable GPU, CUDA fails to run and the system returns an error that no device was found. In order to ensure that your CUDA-capable GPU is not powered down by the operating system, do the following:
Go to System Preferences. Open the Energy Saver section. Uncheck the Automatic graphics switching box in the upper left.

5. Are you a demo user?
If so, please check out this URL: http://render.otoy.com/forum/viewforum.php?f=25 .
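A quick way to check step 2 above is to look for the expected dylibs on disk. A minimal sketch (assuming /usr/local/cuda/lib as the install location; adjust the path if your toolkit went elsewhere, and this checks only a subset of the files listed above):

```python
# Checks whether a few of the CUDA lib files from step 2 are present.
# /usr/local/cuda/lib is an assumed default install path -- adjust it
# if your CUDA toolkit installed somewhere else.
import os

EXPECTED = [
    "libcudart.dylib",    # CUDA runtime
    "libcublas.dylib",    # BLAS
    "libcufft.dylib",     # FFT
    "libcusparse.dylib",  # sparse matrix
    "libcurand.dylib",    # random number generation
]

def missing_libs(lib_dir="/usr/local/cuda/lib"):
    """Return the expected dylibs that are not found in lib_dir."""
    if not os.path.isdir(lib_dir):
        return list(EXPECTED)  # no install directory at all
    present = set(os.listdir(lib_dir))
    return [lib for lib in EXPECTED if lib not in present]

if __name__ == "__main__":
    gone = missing_libs()
    print("missing: " + ", ".join(gone) if gone else "all expected CUDA libs found")
```

If anything is missing, re-run the installer (or try the Pacifist approach from the tip above).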
 

Attachments

  • MyOctanePreferencesWindowCapture.JPG
    MyOctanePreferencesWindowCapture.JPG
    58.7 KB · Views: 151
More Content/Octane output for your review

... Are people writing their own shaders, or just donating materials created from the existing bundled shaders?

Both. Here's a pic of how the LiveDataBase is structured. There's also a subforum for sharing resources like textures, models and HDRI maps: http://render.otoy.com/forum/viewforum.php?f=21

Some of the gallery pieces on their site look great. It would be cool to see a real scene.

Here's the users' forum gallery (which is different from the Gallery [ http://render.otoy.com/gallery.php ] that I believe you're referencing), where there are hundreds, if not thousands, of example artworks created by Octane users (and there are often discussions of the process in the links across those 73 pages): http://render.otoy.com/forum/viewforum.php?f=5
 

Attachments

  • LiveDBCapture.JPG
    LiveDBCapture.JPG
    36.9 KB · Views: 182
Here's the user's forum gallery ... http://render.otoy.com/forum/viewforum.php?f=5

Looking through those user posts, I get the impression that Octane is lacking a subsurface scattering shader or a monolithic shader component (as most renderers use shaders that bundle simple shaders). If I get an Nvidia card, I want to test it myself to see how their shaders hold up split into passes, and to see what their sampler is like. It seems to do ridiculously well with indirect illumination. I can't find much in terms of hair or fur. I clicked on a link that mentioned raspberries, but they lacked any kind of translucence. A couple of others lacked it where I would expect it, which is why I'm wondering if it lacks decent SSS.

If you ever care to post any, I'm still curious about what you've done with octane. I think it's just that I wonder what you do with those insane computing rigs:D.
 
I've got both of them, and it's been my observation that my GTX Titan 6G cards are, on average, about equal to 1.7x a GTX 580 Classified 3G in OpenCL and CUDA rendering in Blender. In what specific use are you interested?

Thank you, yes
 