
Zwhaler
macrumors 604, Original poster (joined Jun 10, 2006)
I'm out of PCIe slots and still want more. Is there any way to do this externally without Thunderbolt? Thank you.
 
Ah, wicked... I was hoping to find something like that a bit cheaper. Even the TB2 boxes for the nMP seemed expensive to me, and this is way more! Thanks for posting anyway.
 
I may have a much cheaper solution for you to try out, at about 5% of the cost of that Expressbox, if you're interested?

Send me a message and I'll send you some details.
 

I'm very interested as well. pm inbound!
 
Hey guys,
I got your messages, thanks.

I haven't responded yet since I'm still researching my theory, so please bear with me.

I was looking at an expansion myself, mainly for graphics cards for SLI and extra power so I don't stress my Mac Pro out.

After reading this topic I followed the link to the nearly $3k system, and looking closely I noticed they use a card I was planning to use, and even the exact same cabling.

What I see in that tower are basically a power supply, an interface card (which would talk to the expansion card in your Mac Pro) and a PCIe expansion board, am I right?

I know of a case that has 6 double-width PCIe slots with a built-in power supply and interface card.

The interface cards I've found are available in x4, x8 and x16.

If you guys think I'm right, I know a guy selling the towers for $120 + shipping; he's actually a member here.

You can get the interface cards for $30-60.

Interface cabling runs $50-80.

There are obviously Mac-compatible drivers for the interface cards, since that company uses the same ones.

Let me know what you guys think, I was actually planning to buy the above gear to run some tests.
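For what it's worth, the numbers quoted above pencil out well against the ~$3k turnkey box. A quick sketch of the totals; the shipping figure is my own guess, everything else is the price mentioned in the post:

```python
# Rough cost comparison of the quoted DIY parts vs. the ~$3k turnkey box.
# Price ranges are (low, high) in USD; shipping is an assumption.
diy_parts = {
    "tower (PSU + backplane, 6 slots)": (120, 120),
    "shipping (guess, not quoted)": (20, 40),
    "interface card": (30, 60),
    "interface cabling": (50, 80),
}

low = sum(lo for lo, hi in diy_parts.values())
high = sum(hi for lo, hi in diy_parts.values())
turnkey = 3000

print(f"DIY estimate: ${low}-${high}")   # → DIY estimate: $220-$300
print(f"Turnkey box:  ${turnkey}")
print(f"Savings:      ${turnkey - high} to ${turnkey - low}")
```

So even at the high end the DIY route is about a tenth of the turnkey price, assuming the drivers cooperate.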
 
Would you mind sharing some more details about the parts? I'm looking for a DIY solution as well. I think all PCIe expansion chassis are built similarly to the diagram below. Correct me if I'm wrong.

[Diagram: typical PCIe expansion chassis layout]
 
I recently bought the Netstor GPU enclosure, only to find my 2010 dual-hexacore MP couldn't make use of its PCI slots: it would only ever recognise two cards in the box. There is a limited number of GPUs the Mac can support, and it also depends on which GPUs you have. I've done a stack of research on this, and for some reason the current version of OSX has hobbled the system. People running multiple GTX Titans in a Netstor or Cubix now find they can only access two of them; it's worth searching elsewhere on these forums for Cubix.

I simply do not know enough about PCIe to figure out what's going on, but the Netstor and GTX 780 GPUs were returned to their vendors and I took a hit on the whole project. Beware: the old Mac Pro in its current OSX configuration simply will not drive unlimited numbers of GPUs; OSX seems to throttle the amount of resources the PCIe system can handle with certain cards.

For example, the MP will recognise the Cubix Desktop Expander and two GTX Titan Blacks, but only one older card like a GTX Titan or GTX 580. Users have reverted to OS X 10.8.5 to make their systems work again.

And Cubix hasn't even tested it with Yosemite yet... For me the whole thing is just too flaky and unreliable at the moment.

I would love to use external GPU rendering, but it might be cheaper and easier to just buy a multi-GPU PC and network it using Octane Render, if you can stomach having a PC in your house.
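One quick way to check how many cards OS X actually recognises is to parse the output of `system_profiler SPDisplaysDataType`. A minimal sketch, run here against a canned sample rather than a live machine; the card names below are placeholders, not real readings from this setup:

```python
# Count GPUs that OS X reports, by parsing System Profiler text output.
# In real use, replace `sample` with the output of:
#   system_profiler SPDisplaysDataType
import re

sample = """\
Graphics/Displays:

    NVIDIA GeForce GTX 680:

      Chipset Model: NVIDIA GeForce GTX 680
      Bus: PCIe
      PCIe Lane Width: x16

    NVIDIA GeForce GTX 680:

      Chipset Model: NVIDIA GeForce GTX 680
      Bus: PCIe
      PCIe Lane Width: x1
"""

# Each recognised card gets its own "Chipset Model:" line.
gpus = re.findall(r"Chipset Model:\s*(.+)", sample)
print(f"{len(gpus)} GPU(s) recognised: {gpus}")
```

If cards in the external box are missing from this list, the OS never enumerated them at all, which matches the two-card ceiling described above.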
 

Thanks for your input on this; those are exactly my thoughts.

Like Steve mentioned just after you, there seem to be some driver issues, which I thought would be a barrier on my quest.

The original topic poster mentioned he wanted more PCIe for storage, so maybe that wouldn't be an issue, but it all comes down to the Mac Pro using the interface.

My idea was using the Nvidia Plex tower, which houses a power supply, 6 single (3 dual) card slots and fans, with of course the controller card.

I know for a fact Nvidia does the drivers for Windows and Linux; I guess it's just an issue with other arrangements on the Mac.

My idea is to run 3 GTX cards in SLI mode for use with Windows, mainly for gaming, but I'm guessing issues may still arise.

This is the tower I'm speaking about:
[Image: Nvidia Quadro Plex 2100 D2]


The controller cards are also Nvidia Branded, as well as the cables.

I know some companies charge a fortune for something like this; that said, I know the Plex units were over $10k when originally released, for both the Tesla and Quadro ranges.
 
Hmm, I'm afraid these enclosures work only with the dedicated Quadro cards they were sold with. AFAIK those Quadros ran a different BIOS than regular ones. Another mystery would be driver support for the interface cards; I've never seen any report of a Plex box being used with OS X.

BTW, the 2200 D2 is still available to order in a few shops in my country, for the equivalent of $13k...
 
My understanding of the issues introduced in 10.9.3 with multiple cards is that it has something to do with the OS itself and "BAR mappings."

Pretty convenient timing... suddenly the cMP has been hobbled to the nMP's spec of only 2 GPUs.

I do think it's odd that it would affect the Titan differently than the Titan Black, so until we get a final answer, I'm not sure what to think.

I happen to have 4 of those Plex housings right now. They each contain two Quadro 5600s. I didn't get the interface card with them, just the units themselves. I'll contact my source and see if he has any more pieces.

I've always wondered if they could be used in other setups, but didn't have much hope. The 5600s were PCIe 1.0, I'm pretty sure, so these particular ones may be limited that way. Also, the Seasonic power supply has no 8-pin connectors, but there are untapped pins on the outputs.

It just seems that if there was an easy and cheap way to do this someone would have found it by now.

I think a guy called BDM Studios has been posting about Cubix issues.
 
Any update with your prototype rig?

Thanks for your interest in my project. Unfortunately I don't have the funds at the moment to try things out, but I have lots of ideas involving graphics and storage, etc.

I was thinking about asking MacVidCards if he'd be willing to sell me the guts of one of his units to mess with (no chassis or PSU), but I doubt he'd do it to be honest, he's a tough cookie :D

I'll let you know one day I guess.
 

Thanks for your interest in my project. Unfortunately I don't have the funds at the moment to try things out, but I have lots of ideas involving graphics and storage, etc.

I was thinking about asking MacVidCards if he'd be willing to sell me the guts of one of his units to mess with (no chassis or PSU), but I doubt he'd do it to be honest, he's a tough cookie :D

I'll let you know one day I guess.


You can't choke up $200 to figure it out and somehow that's my fault?

Nice one.
 
Anybody make progress on this? Wondering about doing this now. Seems like something a person could make themselves easily today.
 
My idea is to run 3 GTX cards in SLI mode for use with Windows, mainly for gaming, but I'm guessing issues may still arise.

Westmere or Nehalem CPUs and PCIe 2.0 will take a hit of up to 10% per GPU, and will struggle to feed data to three cards across a single x16 link connected to the external enclosure. Then you have to find a case that is SLI-compliant. This is futile and expensive when you could build a new Skylake computer for less and put your GPUs in there.
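The bandwidth point is easy to put numbers on. A back-of-the-envelope sketch; the per-lane figures are the usual effective rates after link encoding (8b/10b for gen 1/2, 128b/130b for gen 3), not measurements from this setup:

```python
# Effective PCIe bandwidth per lane, in GB/s each direction, by generation.
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Effective one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

host_link = link_bandwidth(2, 16)   # one x16 gen-2 link to the enclosure
per_gpu = host_link / 3             # three GPUs sharing that single link
native = link_bandwidth(2, 16)      # what one GPU gets in its own x16 slot

print(f"Shared link: {host_link:.1f} GB/s")
print(f"Per GPU:     {per_gpu:.2f} GB/s (vs {native:.1f} GB/s in a native x16 slot)")
```

Roughly a third of native slot bandwidth per card at best, before any switch overhead, which is where the "struggle to feed three cards" claim comes from.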
 
FYI, I have this running; just saw this thread.

I've currently split Slot 4 into 4 dedicated x1 slots, using a PLX PEX8608 (4 lanes in, 4 out, PCIe 2.0).

This means a PLX switch behind another PLX switch (slots 3+4; does anyone know what chip that is, or do I need to boot Linux?) works, at least in a 1:1 routed mode.

I attached the 4 x16 adapter boards to the back of the Mac and expand them into 4 x1 2.0 links up to a 4x x16 mount for 2-slot cards (mostly GPUs).

A 2.0 x1 to 3x x1 1.0/1.1 switched design (PLX, unknown chip, as the marking is not normal) was not detected at all by OSX or Grml (Linux 4.*), but works fine on another non-Apple system (a GPU cluster, Sandy Bridge, Gigabyte). I tried both the PLX slots 3/4 and the normal 1/2.

Pictures here:

https://imgur.com/a/Uz6cj
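On the "what chip is that?" question: booting Linux and grepping `lspci -nn` for PCI bridges is probably the easiest way to find out. A sketch against an illustrative sample line rather than real output; the description text is an assumption, though 10b5 really is PLX's PCI vendor ID:

```python
# Identify PCIe switch chips from `lspci -nn` output. The sample line is
# illustrative of how a PLX switch (e.g. a PEX 8608) typically appears;
# in real use, feed in the output of:  lspci -nn
import re

sample_lspci = """\
05:00.0 PCI bridge [0604]: PLX Technology, Inc. PEX 8608 8-lane, 8-Port PCI Express Gen 2 Switch [10b5:8608]
"""

for line in sample_lspci.splitlines():
    # Class 0604 = PCI-to-PCI bridge; capture name plus vendor:device IDs.
    m = re.search(r"PCI bridge \[0604\]: (.+) \[([0-9a-f]{4}):([0-9a-f]{4})\]", line)
    if m:
        name, vendor, device = m.groups()
        print(f"bridge: {name} (vendor {vendor}, device {device})")
```

The vendor:device pair is the reliable part; even when the package marking is odd, the IDs identify the silicon.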
 
Wow, that's wild. What makes you need so many graphics cards? I would just make sure your chipset has enough available lanes to support them correctly.
 
Wow, that's wild. What makes you need so many graphics cards? I would just make sure your chipset has enough available lanes to support them correctly.

I don't, but it's an interesting dive into PCIe design, and I like having the slots on top for testing.

Currently I run 6 displays on 2 cards: one in Slot 1 at x16 and one external at x1.
 
Oh yeah, this also works if you have no need for the WiFi card or use USB 2.0/3.0 instead (this is an R9 285/380, which is an interesting thing in its own right, but not relevant here):



----
AMD R9 xxx:

  Chipset Model: AMD R9 xxx
  Type: GPU
  Bus: PCIe
  Slot: AirPort
  PCIe Lane Width: x1
  VRAM (Total): 4096 MB
----
 