Awesome. Would you be willing to share details about chassis, drivers, etc?
Are multiple GPUs possible/practical? I'm thinking a Titan Black to start. I wonder how it might stack up against the dual D700s, on processes that can use AMD and/or CUDA.
Thx

Nando posted a link a few posts ago to "my" thread at Techinferno: http://forum.techinferno.com/diy-e-...80@16gbps-tb2-akitio-thunder2-osx10-10-a.html

There you can find the info; MVC has also posted there.

@MVC: today I'll install Mavericks (if I ever find the power supply for my ext. HD :( ).

edit: tried it, but my system refuses to boot from my Mavericks install stick.
 
It works!

nMP with D300s can DOUBLE OpenGL power with a GTX780, and it even has boot screens!

And finally, CUDA.
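
For anyone who wants to sanity-check that CUDA actually sees the card, a minimal sketch along these lines should do it (assuming the CUDA toolkit is installed; compile with nvcc - the filename is just an example):

[CODE]
// Minimal sketch: list the CUDA devices the driver can see.
// Compile with: nvcc -o devquery devquery.cu
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %.1f GB VRAM, compute %d.%d\n",
                    i, prop.name,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                    prop.major, prop.minor);
    }
    return 0;
}
[/CODE]

On this setup you'd expect a single entry for the 780, since CUDA only enumerates Nvidia hardware - the D300s won't show up.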
 

Attachments

  • Screen Shot 2015-01-07 at 9.02.31 PM.png
  • Screen Shot 2015-01-07 at 8.34.00 PM.png
  • Screen Shot 2015-01-07 at 9.07.50 PM.png
  • Screen Shot 2015-01-07 at 9.38.13 PM.png
Expansion Chassis

How much of an increase might I get over the D700s? Can two CUDA cards work in an expansion chassis? Which chassis?
Thanks.
 
Apple more than likely is shutting them down so they can monetise their own future eGPU products when the Intel platform is ready for them to enable it properly: Skylake processors and chipsets next year with TB3 and PCIe 3.0 support, when the logic board will present that PCIe 3.0 port to the operating system, be it on a Macintosh or Windows platform.

They seem to be doing exactly the same preparation for this as I'd expect they would for a future product - it's up to others to realise this and do the same. Fail to prepare - prepare to fail.

No, no, no, and no.

The OpenGL stack is totally not set up for external GPUs, and I've received no indication this is changing, regardless of TB3. (Well, at least for things like hot swap, which would be a requirement for Apple.)

Never attribute to malice that which is adequately explained by stupidity.

Or at least just Apple not caring instead of stupidity.

You think Apple is going to bother working on a Thunderbolt display with a built-in eGPU where you have to shut down to unplug or replug? C'mon.
 
You think Apple is going to bother working on a Thunderbolt display with a built-in eGPU where you have to shut down to unplug or replug? C'mon.

I'd never have thought they'd create a laptop you had to log out of and back in to change which graphics subsystem drove the built-in display, but Apple does always seem to come up with surprises.

Did a software update ever enable on-the-fly switching on those early dual-GPU MBPs, or did they stay a "you bought version 1" solution?
 
I'd never have thought they'd create a laptop you had to log out of and back in to change which graphics subsystem drove the built-in display, but Apple does always seem to come up with surprises.

Did a software update ever enable on-the-fly switching on those early dual-GPU MBPs, or did they stay a "you bought version 1" solution?

No.

OS X can deal with GPU changes when both cards always stay connected. The capability has been there for a long long time.

Dealing with GPU changes when the active card has been disconnected is an entirely different situation, and much harder to handle. Windows can deal with it, but it would take a good amount of effort to make OS X's OpenGL stack handle it.

I would love to see it happen, but just ask yourself how much effort Apple tends to put into OpenGL, and that'll give you your answer on whether this becomes possible.

The alternative is Apple building a display for laptops where, when you hot unplug, your entire system crashes and you lose any unsaved work. Chances of that? Not happening. The way things are set up now, every process actively working on the GPU will crash, at the least. Apple could maybe get the driver not to crash, but that's the least of your worries if the entire window server still goes down.

At least on the 2008/2009 MacBook Pros you could make the user log out gracefully first.
 
No.

OS X can deal with GPU changes when both cards always stay connected. The capability has been there for a long long time.

Dealing with GPU changes when the active card has been disconnected is an entirely different situation, and much harder to handle. Windows can deal with it, but it would take a good amount of effort to make OS X's OpenGL stack handle it. I would love to see it happen, but just ask yourself how much effort Apple tends to put into OpenGL, and that'll give you your answer on whether this becomes possible.

Windows can deal with that, and with hot-plugging PCIe cards, memory and CPUs. Same with being able to update drivers and firmware online without any interruptions.

Why - because Windows users demanded it. Plain. Simple.

Apple users seem to be happy to passively accept whatever the Lords of Cupertino give them.

Y'all need to tell Cupertino that 20th century support doesn't cut it any more.
 
May have some interesting 4K info coming.

Naturally the nMP got here while I have a national spot to Art Direct.

Will get back to testing over weekend.

Think happy thoughts for me re: Dogs.
 
Time for some Octane!

With one 780/6GB.

Now where is that GTX 980 mEFI I tested a while back...

Or should I try 3 780s on nMP?

So many options, so little time.

If I did this right, it looks like a single 780 6GB is faster than two GTX 680s and a little slower than two 680s plus a Q4000.

http://barefeats.com/gpu680v5.html

EDIT: Added results for a single 680 4GB: 3:10, just about twice as long as the 780, and right in line with the Bare Feats tests.
 

Attachments

  • Screen Shot 2015-01-10 at 4.12.13 PM.png
  • Screen Shot 2015-01-10 at 4.45.30 PM.png
Windows can deal with that, and with hot-plugging PCIe cards, memory and CPUs. Same with being able to update drivers and firmware online without any interruptions.

Why - because Windows users demanded it. Plain. Simple.

Apple users seem to be happy to passively accept whatever the Lords of Cupertino give them.

Y'all need to tell Cupertino that 20th century support doesn't cut it any more.

Yes, it would be great if Cupertino would pay attention, but it's also not nearly that simple...

If you look at Windows, it only really supports live GPU switching between GPUs of the same brand. Nvidia Optimus doesn't work with AMD GPUs. That's because each GPU has its own unique extensions. What happens if you swap from an Nvidia GPU to an AMD GPU while you're running an app that relies on Nvidia extensions? Blam! 1s and 0s everywhere, and lost data.

I know Windows can recover from driver faults, but I'm not entirely sure what they guarantee over there. I don't think they necessarily promise an application using the GPU won't crash. I know on OS X that would be very hard to do. Apple might be able to pad around it (using RAM as a buffer), but most applications assume they're continuing to work with the same chunk of VRAM. What happens when you're using a 6 gigabyte card full of data and you switch to a 1 gigabyte card? Apps don't know how to deal with that. What happens if you're doing a bunch of CUDA stuff and suddenly the card swaps to AMD? What happens when you have a lot of very important data that is only in VRAM on a card, and suddenly the user unplugs the card? How does that data make it over to the new card?
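
To make the VRAM point concrete, here's a rough sketch of the pattern most GPU-heavy apps follow (real CUDA runtime calls; the 80% budget is a made-up example policy, not anything from a specific app):

[CODE]
// Typical pattern: query VRAM once at startup and budget against it.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_b = 0, total_b = 0;
    if (cudaMemGetInfo(&free_b, &total_b) != cudaSuccess) {
        std::printf("no usable CUDA device\n");
        return 1;
    }
    // Hypothetical policy: grab ~80% of free VRAM up front.
    size_t budget = free_b / 10 * 8;
    void* pool = nullptr;
    if (cudaMalloc(&pool, budget) != cudaSuccess) {
        std::printf("allocation failed\n");
        return 1;
    }
    // ...scene data, textures, working buffers all live in this pool...
    // If the 6 GB card vanishes and a 1 GB card appears, the pool and
    // every pointer into it are simply gone; nothing migrates the data.
    cudaFree(pool);
    return 0;
}
[/CODE]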

Look, I don't mean to trash anything MacVidCards is doing; I think it's totally valuable for power users who know what they're doing. But the idea that Apple is somehow locking this all down in advance of launching their own solution? That just sounds absolutely nuts to me given all the problems here. Even if Apple wanted to solve them, I'm not entirely sure they are solvable without a lot of changes to OpenGL and a lot of app developers recoding their applications. And there are a lot of applications.

Windows (and OS X, for that matter) is able to do a lot of these things because they only promise very specific situations will work, and almost always that requires sticking to one brand of gear that remains internal to the machine. When you open it up to GPUs that can be unexpectedly removed at any time, along with any combination of different GPUs, things start to get very very messy.
 
Yes, it would be great if Cupertino would pay attention, but it's also not nearly that simple...

If you look at Windows, it only really supports live GPU switching between GPUs of the same brand. Nvidia Optimus doesn't work with AMD GPUs. That's because each GPU has its own unique extensions. What happens if you swap from an Nvidia GPU to an AMD GPU while you're running an app that relies on Nvidia extensions? Blam! 1s and 0s everywhere, and lost data.

I know Windows can recover from driver faults, but I'm not entirely sure what they guarantee over there. I don't think they necessarily promise an application using the GPU won't crash. I know on OS X that would be very hard to do. Apple might be able to pad around it (using RAM as a buffer), but most applications assume they're continuing to work with the same chunk of VRAM. What happens when you're using a 6 gigabyte card full of data and you switch to a 1 gigabyte card? Apps don't know how to deal with that. What happens if you're doing a bunch of CUDA stuff and suddenly the card swaps to AMD? What happens when you have a lot of very important data that is only in VRAM on a card, and suddenly the user unplugs the card? How does that data make it over to the new card?

Look, I don't mean to trash anything MacVidCards is doing; I think it's totally valuable for power users who know what they're doing. But the idea that Apple is somehow locking this all down in advance of launching their own solution? That just sounds absolutely nuts to me given all the problems here. Even if Apple wanted to solve them, I'm not entirely sure they are solvable without a lot of changes to OpenGL and a lot of app developers recoding their applications. And there are a lot of applications.

Windows (and OS X, for that matter) is able to do a lot of these things because they only promise very specific situations will work, and almost always that requires sticking to one brand of gear that remains internal to the machine. When you open it up to GPUs that can be unexpectedly removed at any time, along with any combination of different GPUs, things start to get very very messy.

So the answer is to sell an entire line of computers that ages twice as fast as anything that can change a GPU? And never try to use a TB eGPU so you never have to worry about what happens if you foolishly try to unplug it while using it?

Have a look at the Yosemite/Mavericks/ML forums. Dozens of frustrated iMac and MacBook users are stuck on older OS versions primarily because the GPU doesn't have 64-bit drivers.

Apple is successfully FORCING new purchases through this mechanism.

What's the solution to the myriad problems you just listed?

DON'T UNPLUG THE GPU WHILE USING IT!!!

To me, this is up there with "Don't unplug a drive while writing data to it" and "Don't blow dry your hair while standing in a bathtub full of water" and "Don't drain the oil from your car engine while it is running"

What is the HUGE DRIVING FORCE to be able to unplug a GPU? Why do you keep bringing it up? Where are the posts from people demanding a GPU they can unplug and remove from the computer while it is running?

That's right, there aren't any.

All of those problems go away if you just use your brain instead of insisting on unplugging an eGPU while using it.

This is why I frequently think you are a voice from Cupertino. Who cares about unplugging an eGPU while using it? How about "Do people want fire that can be fitted nasally?" What other silly requirements can we list that have no rational connection to how people actually use a computer?

Don't do one dumb, silly thing, and an eGPU means you can use OctaneRender on a Mini. Wow, a revelation.

But if you apply Apple/Intel's requirement that all TB devices MUST be hot-pluggable, then an eGPU doesn't make the grade. (What happens if you unplug your RAID from TB while it is writing? How does that work, but not an eGPU?)

So, insist that ALL TB devices MUST BE HOT PLUGGABLE and deny yourself an eGPU on moral grounds.

Or use your brain, don't jerk the plug out while it is running and allow yourself to update the 3+ year old GPUs in your nMP.

Hmmm....
 
Yup, I just want a Mac Mini with a graphics card that can drive 3 displays with hardware acceleration. Hot-plugging an eGPU is no more interesting to me than hot-plugging a power supply.

The question I have about hot-plugging is this: if the eGPU enclosure is the first device on your Thunderbolt chain, I'm assuming you can still hot-plug downstream devices, since monitors plugged into the eGPU are connected to the graphics card directly rather than to a TB port.
 
...
Windows (and OS X, for that matter) is able to do a lot of these things because they only promise very specific situations will work, and almost always that requires sticking to one brand of gear that remains internal to the machine. When you open it up to GPUs that can be unexpectedly removed at any time, along with any combination of different GPUs, things start to get very very messy.

Hot removal almost always requires some way of quiescing the device being removed.

If you remove CPUs - you warn the scheduler, and wait for processes to be migrated. If you remove memory - you warn the memory management system, and wait for the DIMMs to be freed up.

(Hot add is much easier - new RAM or CPUs appear and become available.)
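
Applied to an eGPU, that quiescing step could look something like this hypothetical CUDA teardown (real runtime calls; the function name and the "tell the user it's safe" part are made up):

[CODE]
// Hypothetical "eject before unplug" for a CUDA eGPU: drain in-flight
// work, release allocations, and tear down the context - the same idea
// as unmounting a disk before pulling a USB drive.
#include <cuda_runtime.h>

void quiesce_egpu(void* device_pool) {
    cudaDeviceSynchronize();  // wait for running kernels and pending copies
    cudaFree(device_pool);    // release device allocations
    cudaDeviceReset();        // destroy this process's context on the card
    // Now the card is idle from our side; signal the user to unplug.
}
[/CODE]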

MVC's reply is almost perfect - why prevent something just because there are scenarios where you can hurt yourself? If that logic were followed to its end, you'd have no USB or T-Bolt, because it's possible to unplug a cable at an unfortunate time.

Idiot-proof computers are less useful than ones that come with some simple guidelines for usage. (Don't remove the disk while you're writing to it, don't unplug a 3072-core GPU while running a GPGPU app, don't unplug the network while surfing MacRumors...)
 
Thunderbolt really should have implemented a locking mechanism during its development, like Ethernet did.

I agree, and while I'm still waiting for my nMP to be delivered, I've noticed the TB ports on my MBP and both of my TB peripherals are extremely tight once a cable is in there, and I can't see how I would be able to accidentally disconnect them. Maybe the nMP is different or something, because I've read others on this forum commenting on how easy it is to disconnect the cables from the ports.
 