I'm starting to play around with water cooling in my gaming rig. It's an ASUS 2011 board with a "little" 3820, which I've overclocked to 4.5GHz. Along with the up-clocked CPU, I have 2 SuperClocked EVGA Titans in the rig.

Those babies move some air. Seriously. While playing Battlefield 3 across 3 1920x1200 LCD panels (so, essentially: 5760x1200), the impeller fans on those two cards make enough noise such that my office sounds like a data center. Fortunately, I have headphones on when I'm gaming. But the sound is deafening if I don't have them on.

I installed a closed-loop (ie, pre-filled) copper radiator/copper heat block on my CPU a few weeks ago. It came with 2 120mm fans, but my case and motherboard made it impossible for me to fit both. So I have a single 120mm fan blowing through a 2x120mm-sized radiator. It keeps the CPU's temps under 60ºC easily, even under full load.

Next: the Titans. I'm slowly assembling the pieces to begin the water-cooling adventure with them. I already have copper water blocks and an external copper radiator/reservoir/pump combo unit purchased. Once I have the right tubing and fittings, I'll get to work.

Why am I mentioning all of this? I think some or all of it could theoretically apply to the Hack. I know people generally shy away from water-cooling server chips because "they don't need it!" But: they're socket 2011, and there's no reason why a water block won't fit. Certainly the video card(s) can use better cooling, as well as something that will keep your system a lot quieter.

Aside from the obvious 2 downsides of water cooling (cost and potential hazard), the not-so-obvious one is: it sort-of locks your system in as configured. It becomes... challenging... to swap pieces and parts out once they're installed with water blocks all connected together. It's not impossible to do, just challenging. So that has to be kept in mind.

Gee, got any other surprises lurking?! That sounds like an awesome rig/project!

There are quite a few who do watercool these machines; the most notable in this hack community are PunkNugget and braindeadmac. Boxxtech offers watercooled solutions for its top dual-CPU workstations, and there's one guy over HERE who has one of the most ambitious liquid-cooled SR-2 builds I've ever seen. He has no plans to run OSX though.

- - -

Ok so I got my Tyan board today and began the swap. Unfortunately there are a bunch of issues. The mounting plates for the CPU coolers don't match up to my Noctuas. Also, the CPU sockets sit higher than on the SR-2, so the CPU 1 Noctua heatsink butts right up against the top of the case chassis. It doesn't really fit. And the PC-Z70 case is very tall, which puts the power supply too far away for the 24-pin power cable to reach the top of the motherboard.

So what I was hoping would be a simple motherboard swap is going to require a new case and new coolers as well :(
 
I. How to Select An Air Cooler For The GA-7PESH3

Thanks for the info and going the extra mile to find a supplier in UK, appreciated.

I've been doing a bit of research on the case/cooling and come up with:


The Obsidian case, because it generally gets very good reviews ("exceptional" according to AnandTech) and happens to suit my style.

I started thinking about the Corsair cooling after reading this review http://www.legitreviews.com/intel-lga2011-cpu-cooler-roundup-for-sandy-bridge-e-processors_1801/14. Also I guess using all Corsair parts can't hurt for ease of install.

II. I May Have Overrun My Headlights

Will be researching/thinking about this more and hopefully come up with something suitable. Great saying, will promote it over here.

On the memory side of things I could go for existing 8GB DIMMs; I don't think I'm going to need more than 64GB. Either way I'll wait a month or so before fully committing. If I can resist the urge for that long!
 
If you're willing to buy used on eBay, you can get them for about half as much (there are X5680s selling in UK, and you can make a lower offer on at least two of them).

An alternative is buying a used PC workstation with the CPUs you want, swap the CPUs and resell the workstation to recover some of your cost. More hassle perhaps but probably the cheapest upgrade option.

I did have a look at building a hackintosh workstation in the past, but I gave up as support seemed more limited for socket 2011 motherboards (no power management, according to tonymacx86) and information on dual-CPU configs was almost non-existent.

I don't have your requirements or budget, but I'd upgrade your MP first. You could even start with just a GPU upgrade and fix your slow storage issue.

Thanks for the info, much appreciated.

I had a look on eBay and could only see Buy It Now listings at £750. Given the cost, not being sure it's a good investment right now, the new Mac Pro being socket 2011, and so on, I'm seriously considering taking the plunge. If I do, I'll keep the old machine as a backup until I'm happy the new one's stable.

With all the information I've had to absorb I can't remember if I've asked or seen anyone with a build like this. Either way, if I go ahead, will be sharing with everyone here.
 
Although, after doing some initial research on water cooling, I'm leaning back towards fan-based cooling...

Might be worth exploring though, if anyone has any advice.

The Corsair 'all-in-one' coolers are almost as hassle-free as a fan-based unit. They are, however, in a separate league to a complete liquid cooling solution.

I'm fairly certain the discussion jasonvp mentioned before was aimed towards customized liquid cooling loops in which you buy a separate pump, rads, fans, fittings and tubes and work everything out yourself. It can get quite complicated but no air coolers come close to a well constructed system like this. Here's one of my favourite examples. And another. Both these systems run OSX.

The only air cooler I know of which provides nearly as much performance is the Noctua NH-D14, which I have - but see my notes earlier; it's so big it won't even fit in most cases!

----------

and in other news, I didn't even get as far as installing the CPUs in that Tyan before I found this:

http://farm3.staticflickr.com/2850/10061120315_6d01aa7aa7_o.jpg
http://farm8.staticflickr.com/7379/10061089334_86da9d387f_o.jpg

Looks like some bent pins to me... I've raised the issue with the seller so we'll see what happens. :(
 
I'm fairly certain the discussion jasonvp mentioned before was aimed towards customized liquid cooling loops in which you buy a separate pump, rads, fans, fittings and tubes and work everything out yourself.

The system I purchased for my gaming rig's CPU was indeed an all-in-one. No hassle, easy to bolt-in, easy to power up, and no coolant lines to cut or run. All sealed up and whatnot. And it's doing a great job of keeping the CPU cool.

For the Titans, I'm going custom. The radiator, pump, and reservoir are a single unit, but modular so they can come apart if need be. The block for the Titan is 3lbs of hardware(!) Seriously. I weighed it on my kitchen scale to verify. The Koolance quick disconnects are a bit pricey, but top quality and will make assembly/disassembly for service very easy.
 

Attachments: rad1.jpg, rad2.jpg, titan.jpg, fittings.jpg
The system I purchased for my gaming rig's CPU was indeed an all-in-one. No hassle, easy to bolt-in, easy to power up, and no coolant lines to cut or run. All sealed up and whatnot. And it's doing a great job of keeping the CPU cool.

For the Titans, I'm going custom. The radiator, pump, and reservoir are a single unit, but modular so they can come apart if need be. The block for the Titan is 3lbs of hardware(!) Seriously. I weighed it on my kitchen scale to verify. The Koolance quick disconnects are a bit pricey, but top quality and will make assembly/disassembly for service very easy.

Looks great, would love to see some pics of the whole build when it's done. :)

With regards to my Tyan board - the seller has said "the pins don't look that bad and I should bend them back into place - once the CPU is installed they will correct themselves."

Not sure I'm comfortable with this, advice anyone? - Keeping in mind I would have to pay return shipping from Sydney back to the US.
 
Hi DJenkins,

Dammit! Must have water cooling now :|

The Corsair 'all-in-one' coolers are almost as hassle-free as a fan-based unit. They are, however, in a separate league to a complete liquid cooling solution.

I've drawn the same conclusion; it sounds like it might not be appropriate, especially for a dual-CPU rig. Found a specialist company in the UK; once I've settled on a build, I'll get the specs over to them and let them work out the details. Will definitely need some hands-on help with this one.

I'm fairly certain the discussion jasonvp mentioned before was aimed towards customized liquid cooling loops in which you buy a separate pump, rads, fans, fittings and tubes and work everything out yourself. It can get quite complicated but no air coolers come close to a well constructed system like this. Here's one of my favourite examples. And another. Both these systems run OSX.

Will check out that discussion when I have a sec. Those builds are awesome, awesome, and awesome! I'll try and get in touch with those guys, see if they can offer me any advice.

The only air cooler I know of which provides nearly as much performance is the Noctua NH-D14, which I have - but see my notes earlier; it's so big it won't even fit in most cases!


I've heard great things about those fans. Their CPU cooler came top in the review I mentioned previously, beating most of the liquid-cooled competition. If I go the liquid-cooled route, I'll be bumping up to something like an Obsidian 900D; that should be plenty of room.

Looks like some bent pins to me... I've raised the issue with the seller so we'll see what happens. :(

Sucks! Hope this gets sorted without causing a headache.

----------

For the Titans, I'm going custom. The radiator, pump, and reservoir are a single unit, but modular so they can come apart if need be. The block for the Titan is 3lbs of hardware(!) Seriously. I weighed it on my kitchen scale to verify. The Koolance quick disconnects are a bit pricey, but top quality and will make assembly/disassembly for service very easy.

That looks sweet! When are you planning on install? Would love to know how you get on, lessons learned, etc.
 
With regards to my Tyan board - the seller has said "the pins don't look that bad and I should bend them back into place - once the CPU is installed they will correct themselves."

Argh. The pins won't "correct" themselves because they don't actually sit in anything. The CPU rests on them. If you have a jeweler's loupe and some teeny tweezers or needle-nose pliers, then maybe you could bend them back. But the idea of it makes me twitch.

----------

That looks sweet! When are you planning on install? Would love to know how you get on, lessons learned, etc.

Ideally: Friday. I don't drink alcohol at all. A good friend of mine suggested adding a touch of (very cheap) vodka to the water I'm putting into the reservoir, to keep the system clean and disinfected. Not knowing what's "cheap" and what isn't, I have to look to him for help. As it turns out, he's also a bit of a water cooling wonk. So he's going to bring himself and some (very cheap) vodka over on Friday to help me assemble the system. The tedium involved will mainly be working with the Titans and the water blocks. The rest of it will be pretty easy.

The goals will be:
  • Lower the GPU temps. GPU 1 runs hotter than GPU 2, because that's just the way SLI works. It's generally about 10ºC hotter than GPU 2, averaging about 72ºC or so. I want to lower that as much as possible, because like Intel's CPUs, nVidia's GPUs kick up their boost when cooler. It might also let me safely overclock the GPUs even further than they are from EVGA.
  • Peace and quiet, relatively speaking. There will still be some amount of noise, like the pump, some sloshing of bubbles and whatnot in the system, and the external fans blowing air through the radiator. But I'm quite certain the noise level will be significantly reduced as compared to the two OEM fans on the video cards when under full load.

Bear in mind that this is all for a gaming rig. Some or most of it could easily apply to a Hack.
 
The war is on. More heavy artillery is arriving. Who's wearing camouflage?

TOPIC No.1

...Here are the specs if anyone's interested:

4x E5-4650 (engineering samples for $500 each off eBay)
8x 16GB 1600MHz RAM
2x 500GB Samsung 840 Evo SSDs in RAID 0 for the OS (random read = 988MB/s, random write = 846MB/s)
7x 3TB WD Caviar Green HDDs in RAID 5 (16.3TB)
1x GTX 590

Without the 7x HDDs, the 590 and postage it was about US$5900. Considering a full-spec Mac Pro with 2x SSD and 64GB RAM is just over US$9000...
1. One good thing about Sandy Bridge chips is that they cannot be overclocked very much (I still don't like that). So buying used ones isn't as risky as buying, e.g., used Westmeres.
2. I note from your SiSoft Sandra Benchmark [ http://www.sisoftware.eu/rank2011d/...988e9d4e1d0e7dfebd3f587ba8aacc9ac91a187f4c9f9 ] that your four E5-4650 CPUs run at a base speed of 2.7 GHz (which is standard for all of them), but that your four particular E5-4650s turbo boost higher (3.5 GHz) than standard E5-4650s (which reach only 3.3 GHz [ http://www.cpu-world.com/CPUs/Xeon/Intel-Xeon E5-4650 - CM8062107184516 (BX80621E54650).html ]). So you must have the Intel Xeon E5 4650 ES 2.7GHz LGA2011 20MB 8 Core 130W TDP 32nm C1 QBED version. Um! They're as fast as Sandy Bridge E5-2680s [ http://www.cpu-world.com/CPUs/Xeon/Intel-Xeon E5-2680.html ]. A seller named xtrememicro, which is based in the United Kingdom [ http://www.ebay.com/usr/xtrememicro ], sells them for $490 on eBay.

N.B. And as a completely unrelated aside and tremendous diversion - E5-2600 motherboards also run E5-4600 CPUs.

TOPIC No.2
On a completely unrelated topic */ Supermicro is heeding the call to battle.

Here're two recent tank options:

I. 2 CPU/4 Double Wide PCIe Slotted Armored Tank (can be spray painted Sandy or Ivy camouflage)
(1) Here's the shell: Supermicro SuperChassis CSE-747TQ-R1620B 1620W 4U Rackmount/Tower Server Chassis (Dark Gray) w/dual 1620 watt PSUs [ http://www.supermicro.com/products/chassis/4U/747/SC747TQ-R1620.cfm ]. It's $887 from here: http://www.superbiiz.com/detail.php?name=CA-747TQ6B .
(2) Here's the weaponry base: SuperMicro X9DRG-QF Motherboard [ http://www.supermicro.com/products/motherboard/xeon/c600/x9drg-qf.cfm ]. It's $457 from here: http://www.superbiiz.com/detail.php?name=MB-X9DRGQB .
E5-2600 v1 or v2 dual CPUs and coolers, HDDs/SSDs, Memory and GPUs are not included in this $1344 package.


II. 4 CPU/8 PCIe Slotted Armored Tank (can be spray painted Sandy or Ivy camouflage)
(1) Here's the shell: Supermicro SuperChassis SC848E16-R1K62 4U Rackmount/Tower Server Chassis w/dual 1620 watt PSUs [ http://www.supermicro.com/products/chassis/4U/848/SC848E16-R1K62.cfm ]. It's $1579 from here: http://www.superbiiz.com/detail.php...nt5rQumwVI7dieiRFfGZ/ueNlGlzL6z6YvaKPOY2KF3Uw .
(2) Here's the weaponry base: SuperMicro X9QR7-TF Motherboard [ http://www.supermicro.com/products/motherboard/Xeon/C600/X9QR7-TF-JBOD.cfm ]. It's $1173 from here: http://www.wiredzone.com/Supermicro...ard-S-2011-for-4x-Xeon-E5-4600~10022770~0.htm .
E5-4600 v1 (or soon to be announced v2) quad CPUs and coolers, HDDs/SSDs, Memory and GPUs are not included in this $2752 package.

I'm trying to determine whether this system will hold four Titans because if it does, then you'd have a most powerful workstation with four Titans and four E5-4650 v1s with 32 cores (or the soon to be announced v2s with even more cores).

*/ Who's trying to read between the lines? Stop that! Those two topics are completely unrelated because I said that they are. Remember that phrase coined by Marcus Tullius Cicero, "ipse dixit" [ http://en.wikipedia.org/wiki/Ipse_dixit ].

As a pseudo fellow Alabamian [ http://blog.al.com/strange-alabama/2012/06/can_you_spot_an_alabamian_in_a.html ] is known, by some, to have said, "That's All I Have To Say About That" - Forrest Gump {circa 1994}. The Bad Man's lips and hands are now shut tight and he'll neither speak nor scribe further related to inferences of the sort some of the more inquisitive among you may now be drawing. Don't stray. Keep out of my way. Stay away from eBay today. And by all means, don't even think about buying one of these [ Supermicro System-8047R-7RFT+ 4U LGA2011 E5-4600 PCIE DDR3 SAS Sat 1400W 80+ RTL ] (as I've done twice) for $2237.38 from here: http://www.provantage.com/supermicro-sys-8047r-7rft~7SUPB08K.htm . ;)
 
On a completely unrelated topic */ Supermicro is heeding the call to battle.

Here're two recent tank options:

I. 2 CPU/4 Double Wide PCIe Slotted Armored Tank (can be spray painted Sandy or Ivy camouflage)
(1) Here's the shell: Supermicro SuperChassis CSE-747TQ-R1620B 1620W 4U Rackmount/Tower Server Chassis (Dark Gray) w/dual 1620 watt PSUs [ http://www.supermicro.com/products/chassis/4U/747/SC747TQ-R1620.cfm ]. It's $887 from here: http://www.superbiiz.com/detail.php?name=CA-747TQ6B .
(2) Here's the weaponry base: SuperMicro X9DRG-QF Motherboard [ http://www.supermicro.com/products/motherboard/xeon/c600/x9drg-qf.cfm ]. It's $457 from here: http://www.superbiiz.com/detail.php?name=MB-X9DRGQB .
E5-2600 v1 or v2 dual CPUs and coolers, HDDs/SSDs, Memory and GPUs are not included in this $1344 package.

This looks very nice... Do you see this as a suitable candidate for a Hackintosh build?
Sub Total $1,343.98 then add in a couple of these E5-2690 v2
Estimated Total (before Tax & Shipping) $5,281.96
 
This looks very nice... Do you see this as a suitable candidate for a Hackintosh build?
Sub Total $1,343.98 then add in a couple of these E5-2690 v2
Estimated Total (before Tax & Shipping) $5,281.96

It'll run all of the most popular OSes, but today it will only run Windows and Linux optimally. Running your most popular OS optimally will have to await the arrival of a product that, in name, reminds me of two of my favorite cowboys on television - the Maverick brothers, Bret and Bart [ http://en.wikipedia.org/wiki/Maverick_(TV_series) ]. For about 59 years I've been looking for Mavericks.
 

Attachments: BabyTutor&Horsy.png
Meh, my hardware build is taking longer than expected. Bloody water cooling setup, it's my first, and so there are hiccups on the way.

For example, I did not know that full-cover GPU water blocks from different manufacturers do NOT have water port holes that align, so mixing and matching does not work. In my case I got an XSPC Titan block and an original watercooled EVGA GTX 480, and I thought I could just bridge them straight, but noooooooo, they are off by 1/2" - tough luck. @#($^%& no standards at all.

Good news is that Rampagedev got SR-X running well, and since I'm a lucky owner of one, I'm looking forward to that. :D

Still building my water loop and then there's testing it for leaks and then live test and fingers crossed I have a nice system soon...
 
How can you possess the 3d rendering power of forty E5-2687W v1 systems for under $7K?

How You Can Possess The 3d Rendering Power Of Forty E5-2687W v1 Systems For Under $7,000

Attached is a sample that you can modify to fit better with your needs and budget. This example is a basket that I built at Newegg. It's an overclockable four GPU Octane workstation for a Maya user for $5,327 ($4,840 System + $487 OctaneRender Combo) w/o monitor, keyboard, mouse, backup storage, shipping and taxes. When adequately powered, OctaneRender will render your 3d scene while you are building it, resulting in WYSIWYG scene building. The OctaneRender Combo consists of the Octane renderer and a plugin that allows you to run Octane from inside your 3d application. Users of other 3d packages can purchase an OctaneRender combo for their 3d application for the same price, unless the plugin for your 3d application is still in beta, in which case the price will be even less (80 € or about $108.46 less). Under Windows, you can use EVGA's Precision X and Control Panel to tweak the GPUs for even greater performance. Since the speed of your system's ram and CPU does not affect the speed of GPU rendering with OctaneRender, you can save some money by, e.g., purchasing a less expensive LGA 1155 CPU, less ram or less expensive ram, and/or just one or different SSDs. Gigabyte motherboards have traditionally been the most compatible ones for running non-Windows/non-Linux OSes.

An overclockable, faster renderer for complex, large scenes costs $6,527 w/o monitor, keyboard, mouse, backup storage, shipping and taxes. For faster rendering and to render more complex scenes, the additional $1,200 buys you four EVGA 06G-P4-2790-KR GeForce GTX TITAN 6GB 384-bit GDDR5 SLI Support Video Cards instead of Four GTX 780s. In OctaneRender, a tweaked GTX Titan in Windows is faster than any current Tesla card [ compare http://www.nvidia.com/object/tesla-supercomputing-solutions.html with http://render.otoy.com/faqs.php "What are the hardware requirements for Octane Render?" ]. Moreover, according to Nvidia a single Tesla K20 has 10 times the compute potential of an E5-2687W v1 [ http://www.nvidia.com/content/tesla/pdf/Tesla-KSeries-Overview-LR.pdf ]. In OctaneRender, each additional GTX card added to the mix results in a linear increase in rendering ability (see post # 798, above, for an example of how this really works). In sum, with four GTX Titans you'll have the compute potential and rendering performance of about forty E5-2687W v1 single CPU systems for under $7K. A single non-tweaked GTX Titan renders OctaneRender's Benchmark scene about 15 seconds faster than a non-tweaked GTX 780 [ http://www.barefeats.com/gputitan.html ]. The GTX Titan took 95 seconds, while the GTX 780 took 110 seconds. Thus, the GTX Titan is about 16% faster at this task. Additionally, the GTX Titan has twice the memory (6 gig) of a GTX 780 (3 gig) to handle larger, more complex scenes.
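Just to make the Titan vs. GTX 780 comparison above concrete, here's a tiny sanity check of that speedup claim (a rough sketch in Python; the only inputs are the barefeats render times quoted above, nothing new is measured here):

Code:
# Quick check of the Titan vs. GTX 780 speedup using the OctaneRender benchmark
# times quoted above (95 s vs. 110 s, from barefeats).

titan_secs = 95.0    # GTX Titan, non-tweaked
gtx780_secs = 110.0  # GTX 780, non-tweaked

speedup = gtx780_secs / titan_secs  # throughput ratio
print(f"Titan is ~{(speedup - 1) * 100:.0f}% faster than the GTX 780")   # ~16%
print(f"Time saved per benchmark run: {gtx780_secs - titan_secs:.0f} s") # 15 s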

BTW - One bonus of using a motherboard like this one that has onboard video is that you can use the onboard video to build your scene without taxing any of your GPUs with this task, thereby also allowing the OS to run smoothly and the computer to remain more responsive while the four GPUs are tasked solely with rendering.
 

Attachments: FastCheap3dRenderer.pdf
When you pull out that kind of comparison of render time per dollar, a $1000 GPU seems like a bargain :D
 
When you pull out that kind of comparison of render time per dollar, a $1000 GPU seems like a bargain :D

I pulled some comparisons out in post #798, above, in the section: "III. Maximizing 3d Rendering Performance With Multiple GTX GPUs." The bottom line is that the cost savings over CPU-focused systems, for anyone producing income from 3d, would make multiple $1k Titans seem like they were being sold by a dollar store running a special clearance sale.
 
I've been getting more into 3d rendering as of late using C4D and would like to harness the CUDA "workhorse", but I feel strongly about staying in OSX. I just feel super comfortable with the OSX workflow/Finder etc. The last thing I would want to do is move to Windoze, but if push comes to shove...?
I saw you posted about GPU expansion boxes, for example the Netstor NA255A or CUBIX GPU Expander.

The question is: could I simply set up one of these boxes, slap in 4 normal Titan GPUs (non-EFI) and utilize the computing power, leaving an ATI 5870 EFI GPU card in the MPro for the boot screen etc.? Just trying to find the best way to save time and money.
 
I've been getting more into 3d rendering as of late using C4D and would like to harness the CUDA "workhorse", but I feel strongly about staying in OSX. I just feel super comfortable with the OSX workflow/Finder etc. The last thing I would want to do is move to Windoze, but if push comes to shove...?
I saw you posted about GPU expansion boxes, for example the Netstor NA255A or CUBIX GPU Expander.

I've never been a non-Mac user, except for the many years of my life when individuals couldn't even afford a computer and lacked the gigantic space and massive cooling required of the computers of that era. The first time I saw a real computer, it was in a massive building built to house it (of course, there was no GUI and we punched cards - I was told that it, or an earlier incarnation of it, was used to help develop the first A-bomb). I don't dislike Apple, Inc. They're holding a lot of my money, a whole lot of it - but I try hard not to allow my personal financial interests to affect my advice. I do, however, not like Apple's "we're all-knowing" approach to its customer base (particularly the limited PCIe slots and the lack of Apple-approved adaptations to new GTX cards, so that we are left to hunt for good approaches to their use on our own). However, I believe that over 99% of Apple's current and potential customer base doesn't feel as I do.

I'm glad that I never took the blue pill - "Apple or die." From the beginning, I started taking the red pill. Throughout my life with computers, I've viewed them like the tools in my tool room, made by various manufacturers, e.g., some Craftsman and others like Echo, etc. Whoever supplies what I need, when I want/need it and for the right price, gets my patronage. Apple pushes and shoves us into accepting (1) nothing, (2) compromises and/or (3) other systems when it comes to what we want/need in GTX cards. I've already confronted the cost and timing issues involved in moving my 3d applications across platforms. However, I'll continue to allow Apple to hold my money because, as stated above, I know that a lot of people don't feel the same as I do. Variety truly is the spice of life.

I've posted about those expansion boxes, but mainly about their being overpriced and thus only for those who say, "Mac or die - cost be damned." From a hardware point of view, one would pay more money for a four slot chassis with GTX cards than he/she'd spend for the whole Windows system that I just described or comparatively for my eight double x16 PCIe slotted Tyan Server (vs. an eight double x16 PCIe slotted external chassis), and his/her GTX cards will not run as fast in that external chassis (unless it was connected to a Windows computer). If one has the money to live with the credo of "I'm only Mac" and its resulting advantages and disadvantages, then more power to him/her and I'll still try to help him/her the best that I can. But there will be limitations to my ability to help because I'm growing weary of trying to force a Mac to do something that Apple refuses to help it to do. In fact, I'm considering hackinwining my Mac Pros to enhance their ability to run GTX cards better, but hope that doesn't leave me whining. Since my older Macs/Apples were not as much from the PC mold as my Mac Pros are, I'll leave those older systems, as well as my MBPs, as they are.

I hope that no one perceives anything that I've written here as a putdown, because I am not putting anyone down. I'm cheap, and you're welcome to view this as just one of the cheap Bad Man's rants. That's all I've got to say about that.

The question is: could I simply set up one of these boxes, slap in 4 normal Titan GPUs (non-EFI) and utilize the computing power, leaving an ATI 5870 EFI GPU card in the MPro for the boot screen etc.? Just trying to find the best way to save time and money.

My understanding (I've never worked with a modern external chassis, although I have an external Nubus chassis for my older Apples) is that you would have to do whatever Mac users have to do to run Titans (no EFI = without a boot screen), such as having the appropriate Nvidia GTX driver and CUDA library and driver versions installed and making sure that your chassis has sufficient PSU power for as many Titans as you install. Leaving an ATI 5870 in your Mac is a great idea, not only for giving you a boot screen, but as I've indicated in previous posts, the maker of OctaneRender recommends that you have a GPU dedicated to acting solely as a video display, thereby allowing the OS to run smoothly and the computer to remain more responsive while the GTXs are tasked solely with rendering.

BTW - I searched Wikipedia and elsewhere for animated movies and, for those articles that went into detail about production, I saw that anywhere from 300 to thousands of computers were used at various stages of production, without even a hint of near-realtime rendering/display. For example: 1) "The Weta data center got a major hardware refresh and redesign in 2008 and now uses more than 4,000 HP BL2x220c blades. ... " http://www.datacenterknowledge.com/archives/2009/12/22/the-data-crunching-powerhouse-behind-avatar/ , 2) for The Croods, DreamWorks relied on 3,000 HP BladeSystem c-Class server blades combined with a server render farm consisting of 20,000 processors [ http://www.geek.com/news/dreamworks...on-compute-hours-creating-the-croods-1544214/ ] and 3) "For example, it took Pixar two years to completely render all 114,000 frames of the 77 minute film Toy Story using a render farm with inherent parallelism." [ http://cloudtimes.org/2013/01/19/cloud-case-studies-the-animation-industry/ ].

On the one hand that fortifies my inclination to bias my systems toward greater GTX GPGPU CUDA ability; on the other hand, the rapidity of change makes me reluctant to advocate that someone who wants to emulate one of my older self-built systems, such as my 2010 and 2011 Hacks that get Geekbench 2 scores of 40,100 and 40,050, respectively, pursue courses of action that I took in the not too distant past. As with most choices, there are pros and cons, and things coming down the pike can drop just after one has pulled the trigger on something, immediately obsoleting it. But 60 years of living in this shell is making me more and more comfortable with the truth that change will come no matter how I choose to handle it. And to me that better justifies my frugality, so that I can take advantage of those things that catch me by surprise. Sincerely, and for so long as I am permitted to occupy my current shell, the cheap Bad Man.
 
I've been getting more into 3d rendering as of late using C4D and would like to harness the CUDA "workhorse", but I feel strongly about staying in OSX. I just feel super comfortable with the OSX workflow/Finder etc. The last thing I would want to do is move to Windoze, but if push comes to shove...?
I saw you posted about GPU expansion boxes, for example the Netstor NA255A or CUBIX GPU Expander.

The question is: could I simply set up one of these boxes, slap in 4 normal Titan GPUs (non-EFI) and utilize the computing power, leaving an ATI 5870 EFI GPU card in the MPro for the boot screen etc.? Just trying to find the best way to save time and money.

Yes, you should be able to use an expansion box just fine for 3D rendering. People have been doing this with Mac Pros and DaVinci Resolve for ages. It's basically a transparent addition to your PCIe slot line-up; your host Mac Pro just thinks there are more PCIe slots. However, take a close look at the model you actually go for - even though they have x16-sized slots, some only run at x4 speed per slot, and other models max out at x8 speed per slot if you want to pay the price premium. The slots are "x16 mechanical" - they will physically take and power a true x16 card - but the actual data throughput is not x16.

The expansion chassis is then usually plugged into an x8 speed slot in your Mac Pro via the host card, so essentially the bandwidth of the host card's x8-lane slot is being split between the four cards. You may need to investigate how much of a performance hit running at lower bandwidth will actually cause for your GPU workload; it could affect your choice of GPU and expansion chassis.
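To put a rough number on that split, here's a quick sketch (Python, assuming PCIe 2.0 figures of roughly 500 MB/s usable per lane; the real numbers depend on your Mac Pro, host card and chassis):

Code:
# Rough, assumption-laden sketch of the bandwidth split described above.
# Assumes PCIe 2.0 (~500 MB/s usable per lane after 8b/10b encoding);
# real-world throughput varies with chassis, host card and workload.

MB_PER_LANE = 500  # approximate usable MB/s per PCIe 2.0 lane

def slot_bandwidth(lanes: int) -> int:
    """Approximate one-direction bandwidth of a slot, in MB/s."""
    return lanes * MB_PER_LANE

host_link = slot_bandwidth(8)    # x8 host card in the Mac Pro
native_x16 = slot_bandwidth(16)  # what a card expects on a desktop board
gpus_in_chassis = 4

per_gpu_when_shared = host_link / gpus_in_chassis
print(f"x8 host link total:        {host_link} MB/s")
print(f"Per GPU (4 cards sharing): {per_gpu_when_shared:.0f} MB/s")
print(f"Native x16 slot:           {native_x16} MB/s")
# (For Octane-style GPU rendering the hit is mostly at scene-load time,
# since the scene lives in VRAM during the render itself.)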

----------

How You Can Possess The 3d Rendering Power Of Forty E5-2687W v1 Systems For Under $7,000
.... Moreover, according to Nvidia a single Tesla K20 has 10 times the compute potential of an E5-2687W v1 [ http://www.nvidia.com/content/tesla/pdf/Tesla-KSeries-Overview-LR.pdf ].

That's a great little info bite right there Tutor, I was always wondering how many GPUs it would take to outgun our little CPU netrender gang of 60+ Mac Pro and iMac cores at work. Hard to comprehend that setup could be overpowered by a few GPUs in a single machine!
 
Yes, you should be able to use an expansion box just fine for 3D rendering. People have been doing this with Mac Pros and DaVinci Resolve for ages. It's basically a transparent addition to your PCIe slot line-up; your host Mac Pro just thinks there are more PCIe slots. However, take a close look at the model you actually go for - even though they have x16-sized slots, some only run at x4 speed per slot, and other models max out at x8 speed per slot if you want to pay the price premium. The slots are "x16 mechanical" - they will physically take and power a true x16 card - but the actual data throughput is not x16.

The expansion chassis is then usually plugged into an x8 speed slot in your Mac Pro via the host card, so essentially the bandwidth of the host card's x8-lane slot is being split between the four cards. You may need to investigate how much of a performance hit running at lower bandwidth will actually cause for your GPU workload; it could affect your choice of GPU and expansion chassis.

Yes, you make a valid point - the "PIPE" is only so big. The more it's compared and weighed, going the route of a PC like the Supermicro workstation SY-747GTRF or SY-74GRTPT makes more sense, since you can expand up to 4 x16 double-width GPU cards. Which poses another question... Would it be possible to have a workflow using an MPro running C4D and harness the aforementioned workstation's GPU resources through 10GbE Ethernet, or a RAID card, or would that be a dog chasing its own tail? I admit I need to do a lot more research on net rendering, etc. Been too busy to get my head around this.
- Many Thanks
 
At factory settings (w/o overclocking) Titans crush the K20X in Octane, but with O'cing ... well

Yes, you make a valid point - the "PIPE" is only so big. The more it's compared and weighed, going the route of a PC like the Supermicro workstation SY-747GTRF or SY-74GRTPT makes more sense, since you can expand up to 4 x16 double-width GPU cards. Which poses another question... Would it be possible to have a workflow using an MPro running C4D and harness the aforementioned workstation's GPU resources through 10GbE Ethernet, or a RAID card, or would that be a dog chasing its own tail? I admit I need to do a lot more research on net rendering, etc. Been too busy to get my head around this.
- Many Thanks

The attached chart illustrates another reason why I'd recommend running Titans in Windows, where you can tweak core and memory performance with EVGA Precision X and the Nvidia Control Panel, rather than compromising their available performance. There's also some info in that chart to compare other CUDA GPUs, such as the GTX 780, which - because OctaneRender was re-written to take better advantage of Kepler cards [really high single precision floating point peak performance, but lower double precision floating point peak performance than Fermi (GTX 400s/500s)] - really shines in rendering now; but Titan is still The King.
 

Attachments: TutorsGPUcomparison.pdf
Here's my 10 cents or points on whether Titans are overpriced.

When you pull out that kind of comparison of render time per dollar, a $1000 GPU seems like a bargain :D

Now for more detail. And I agree that a $1K Titan is truly a bargain, as more fully shown below.

If one's into 3d animation rendering, keep reading -

This, in detail, is how I do the CPU vs. Tesla vs. Titan GPU comparison.

1) One Tesla K20X = ten E5-2687W v1. That Tesla card costs at or above the $3,400 range. Also, please keep in mind that rendering with CPUs does not yield linear increases in performance even when the additional CPUs are in the same system, whereas rendering with GTX GPUs in OctaneRender does yield linear increases in performance when the GPUs are in the same system.

2) One E5-2687w v1 = 8 cores that run at 3.1 GHz at base; 6 to all cores at 3.4 GHz at stage 1 turbo; 4 or 5 cores at 3.5 GHz at stage 2 turbo; 2 or 3 cores at 3.6 GHz at stage 3 turbo; and 1 core at 3.8 GHz at stage 4 turbo. Otherwise the others are still running at base of 3.1 GHz.

3) Therefore, one Tesla K20X can deliver performance equal to that of 80 CPU cores that run at 3.1 GHz at base; 60 to all cores at 3.4 GHz at stage 1 turbo; 40 or 50 cores at 3.5 GHz at stage 2 turbo; 20 or 30 cores at 3.6 GHz at stage 3 turbo; and 10 cores at 3.8 GHz at stage 4 turbo; with the others running at base of 3.1 GHz.

4) One Titan w/o overclocking by EVGA Precision X, but with all double precision floating point peak performance enabled by Nvidia Control panel is faster than a Tesla K20X. See the chart in my earlier post.

5i) Therefore, one Titan in Windows w/o overclocking by EVGA Precision X, but with all double precision floating point peak performance enabled by Nvidia Control panel, can deliver performance better than that of 80 CPU cores that run at 3.1 GHz at base; 60 to all cores at 3.4 GHz at stage 1 turbo; 40 or 50 cores at 3.5 GHz at stage 2 turbo; 20 or 30 cores at 3.6 GHz at stage 3 turbo; and 10 cores at 3.8 GHz at stage 4 turbo; with the others running at base of 3.1 GHz.

5ii) Comparison: 836 / 735 (see my chart - compare base speed of K20X with Titan's) = 1.13741496598639; Check: 836 / 735 * 3950 = 4492.789 =~ 4,500; 1.137 * 1310 = 1489.47 = ~ 1,500 (See my chart - single & double precision floating point peak performance values for Titan and K20X)

6) At Titan's factory setting, one Titan has single precision floating point performance 1.137 times that of one Tesla K20X. So the Titan effect can be expressed this way:
a) 3.1 GHz x 1.137 =~ 3.50 GHz, all eight cores at base;
b) 3.4 GHz x 1.137 =~ 3.86 GHz, all eight cores at stage 1 turbo;
c) 3.5 GHz x 1.137 =~ 3.98 GHz, 4 or 5 cores at stage 2 turbo;
d) 3.6 GHz x 1.137 =~ 4.09 GHz, 2 or 3 cores at stage 3 turbo; and
e) 3.8 GHz x 1.137 =~ 4.32 GHz, 1 core at stage 4 turbo.
f) Otherwise the remaining cores are still running at base - 3.50 GHz.
You'd have to have a CPU with these characteristics to approximate 1/10th of the Titan's effect. The closest you'd get to this is one of Dave's overclocked E5-2687W's v2 (See post no. 774 and http://www.cpu-world.com/CPUs/Xeon/Intel-Xeon E5-2687W v2.html )

7) But that equivalency to overclocked E5-2687W v2s is just a small part of the comparison, because I've overclocked my Titan(s) by a factor of 1.49. So you'd have to then overclock those overclocked E5-2687Ws by another 1.49 times. Here's the next stage of the math:
a) 3.50 GHz x 1.49 =~ 5.20 GHz, all eight cores at base;
b) 3.86 GHz x 1.49 =~ 5.75 GHz, all eight cores at stage 1 turbo;
c) 3.98 GHz x 1.49 =~ 5.93 GHz, 4 or 5 cores at stage 2 turbo;
d) 4.09 GHz x 1.49 =~ 6.09 GHz, 2 or 3 cores at stage 3 turbo; and
e) 4.32 GHz x 1.49 =~ 6.44 GHz, 1 core at stage 4 turbo.
f) Otherwise, the remaining cores are still running at base - 5.20 GHz.
You'd have to have a CPU with these characteristics to approximate 1/10th of the overclocked Titan's effect.

8) But to be kind, let's not cut the Titan into tenths; rather, let's keep it whole. One overclocked Titan has the compute potential of ten such 8-core CPUs - CPUs that we have no idea when, if ever, will drop. In other words, one whole overclocked Titan has the compute potential of a setup with these characteristics:
a) all eighty cores at base - 5.20 GHz;
b) all eighty cores at stage 1 turbo - 5.75 GHz;
c) 40 to 50 cores at stage 2 turbo - 5.93 GHz;
d) 20 or 30 cores at stage 3 turbo - 6.09 GHz; and
e) 10 cores at stage 4 turbo - 6.44 GHz.
f) Otherwise, the remaining cores are still running at base - 5.20 GHz.

9) Extending Nvidia's comparison of 10 particular CPUs to one Tesla K20X, let's use GHz equivalency and take the CPU with the highest total GHz output - that is, the Xeon E5-2697 v2, which has a base speed of 2.7 GHz (12 cores * 2.7 GHz = 32.4 GHz) and a turbo speed of 3.5 GHz (3.5 GHz * 12 cores = 42 GHz) - and see how one Titan stacks up to that:
a) One Titan has the total base GHz equivalency of 80 cores * 5.2 GHz = 416 GHz.
b) 416 GHz / 32.4 GHz = 12.84; So one fully tweaked $1,000 Titan is equal to approximately thirteen $2,600+ Xeon E5-2697s. 13 * 2,600 = $33,800+ for 13 Xeon E5-2697.
c) If you spent $4,000 for 4 Titans to put on a motherboard with 4 double wide, x16 PCIe slots, you have the equivalent of
(416 GHz * 4 = 1,664 GHz) / 32.4 GHz = 51.36 or about 51 Xeon E5-2697s. 51 E5-2697s would cost you about $132,600. $4K vs. $132K: Um - which route would you take?


10) My WolfPackAlphaCanisLupus has, in one case, 8 Titans that have a very high CPU core rendering equivalency.
a) all 640 cores at base - 5.20 GHz;
b) all 640 cores at stage 1 turbo - 5.75 GHz;
c) 320 to 400 cores at stage 2 turbo - 5.93 GHz;
d) 160 or 240 cores at stage 3 turbo - 6.09 GHz; and
e) 80 cores at stage 4 turbo - 6.44 GHz.
f) Otherwise, the remaining cores are still running at base - 5.20 GHz.
g) 640 cores * 5.2 GHz = 3,328 GHz; 3,328 GHz (8 Titans) / 32.4 GHz (Xeon E5-2697) = ~102.72. 103 x $2,600 = $267,800. What would it cost to house and power 102 Xeon E5-2697s? Two thousand dollars to house a pair would be on the low end. So lets say, while forgetting the electrical power issue, $102,000 for the housing (102/2 = 51; 51 * $2,000 = $102,000) + $267,800+ for CPUs (or $370,000) vs. $8,000 for GPUs + about $6,000 for housing (or about $14,000 total): Um- did I take the wrong route by choosing the 1/26 path ($370,000 / $14,000 = 26.42) ? Not!

Now, doesn't a Titan seem dirt cheap as a 3d animation rendering tool, and doesn't it justify a desire, if not an obsession, for as many double-wide x16 PCIe slots as can be had in one Windows system? If not, then one's most likely not into 3d animation rendering.
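If anyone wants to replay that arithmetic, here's a minimal sketch (Python) using only the figures already derived above - the 80-core equivalency, the ~5.2 GHz per-core-equivalent figure from steps 7 and 8, and the E5-2697 v2's 12 cores x 2.7 GHz:

Code:
# Minimal sketch re-running the GHz-equivalency arithmetic from the steps above.
# All inputs come from that post (Nvidia's 10x CPU claim, the 1.137 clock ratio,
# my 1.49 overclock factor); treat the output as rough equivalencies for
# OctaneRender-style GPU rendering, not general benchmarks.

CORE_EQUIV_PER_TITAN = 80        # step 3: one K20X-class card ~ ten 8-core E5-2687W v1
GHZ_PER_CORE_EQUIV = 5.2         # steps 7/8: 3.1 GHz x 1.137 x 1.49, rounded
E5_2697V2_BASE_TOTAL = 12 * 2.7  # GHz: 12 cores at 2.7 GHz base = 32.4 GHz

def titan_equiv(n_titans: int) -> float:
    """Total GHz equivalency of n tweaked Titans, per the method above."""
    return n_titans * CORE_EQUIV_PER_TITAN * GHZ_PER_CORE_EQUIV

for n in (1, 4, 8):
    total = titan_equiv(n)
    print(f"{n} Titan(s): ~{total:.0f} GHz equivalent "
          f"= ~{total / E5_2697V2_BASE_TOTAL:.1f} x Xeon E5-2697 v2")
# 1 -> ~416 GHz (~12.8x), 4 -> ~1664 GHz (~51.4x), 8 -> ~3328 GHz (~102.7x)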
 
... That's a great little info bite right there Tutor, I was always wondering how many GPUs it would take to outgun our little CPU netrender gang of 60+ Mac Pro and iMac cores at work. Hard to comprehend that setup could be overpowered by a few GPUs in a single machine!

In light of my last post, how many Titans would it take to overpower that CPU netrender gang? In that netrender gang, how many cores are there and what are their speeds?
 
In light of my last post, how many Titans would it take to overpower that CPU netrender gang? In that netrender gang, how many cores are there and what are their speeds?

Hey Tutor I've been a bit tied up but I read through the comparison and your methodical approach is outstanding :)
I'm not at work but give me a sec and I should be able to tally up all our machines...

Now, I did what I thought was a crude comparison a few posts back, adding up GHz x cores as a single total number - is that an accurate enough representation of actual CPU power? JasonVP mentioned there is a loss from 'context switching' with multiple threads, which I hadn't heard of.

- - - edit: and of course in my case at work there is quite a range of generations, where each architecture version will perform differently
 
Pasting a spreadsheet text in here didn't format well so I've inserted it as an image.

computer_list.png


Anyway, the overall total comes to 229.20 GHz (cores x clock).
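Putting that total next to Tutor's Titan figures (a rough sketch using the GHz-equivalency method from his post above; it obviously ignores the architecture differences between generations in my list):

Code:
# Crude side-by-side of our office netrender farm vs. a single Titan, using the
# GHz-equivalency figures from Tutor's post above. Ballpark only.

farm_total_ghz = 229.20        # sum of cores x clock across all our Macs (spreadsheet above)

titan_stock_ghz = 80 * 3.5     # step 6 above: stock Titan under Windows
titan_tweaked_ghz = 80 * 5.2   # step 8 above: overclocked Titan under Windows

print(f"Office farm total:   {farm_total_ghz:.1f} GHz")
print(f"Stock Titan equiv:   {titan_stock_ghz:.1f} GHz "
      f"({titan_stock_ghz / farm_total_ghz:.2f}x the farm)")
print(f"Tweaked Titan equiv: {titan_tweaked_ghz:.1f} GHz "
      f"({titan_tweaked_ghz / farm_total_ghz:.2f}x the farm)")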

Does not seem very impressive next to those Titan figures you posted :eek:

So it looks like this mass of power-consuming steel performs just below the stock settings of a single Titan under Windows. Can we approximate the difference between this and how the card performs under OSX?

----------

Ok I just couldn't get my head around a single card on my home machine outperforming every mac combined at work, so I have just bought the full version of Octane for C4D - better go back and remember how to use those lights and materials. A new Titan will soon be alongside that GTX570 in my SR-2.

Crazy how I needed a realistic point of reference to convince me - sometimes numbers on a screen are hard to put in perspective :D
 