
davidf18
Original poster
Documentation says that one must run macOS High Sierra in order for the Nvidia CUDA drivers to work properly, so that you can use an Nvidia eGPU for deep learning.

Programs like TensorFlow support CUDA/Nvidia, but not AMD. Also, the higher-end Nvidia GPUs, such as the 2080 Ti, are very fast.

Is that a correct understanding?

Also, is there any workaround?

Finally, I guess I could run a dual-boot system with the current macOS and High Sierra.

Would it be possible to run an eGPU under a Parallels VM?

Many thanks for any solutions to this problem: the need to run an Nvidia eGPU for deep learning.
 

jerryk

First, a small correction: it's spelled Nvidia, not "NVIDEA."

Second, forget about trying to run an Nvidia RTX 2080 on your Mac.

Just go to http://colab.research.google.com. There you can get a free Jupyter Notebook environment (you only need a Gmail account) with Nvidia GPU support (go to Notebook Settings and select GPU). The Colab environment has Python, TensorFlow, PyTorch, NumPy, etc. pre-installed, so you don't have to mess with any of that and can focus on learning ML instead of setting up systems.
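Once the GPU runtime is selected, a quick sanity check in the first notebook cell confirms the accelerator is actually attached (a minimal sketch, assuming the TensorFlow 2.x and PyTorch builds Colab pre-installs):

```python
# Run in a Colab cell after choosing Runtime -> Change runtime type -> GPU.
import tensorflow as tf
import torch

# TensorFlow: an empty list here means no GPU is attached to the runtime.
print("TF sees:", tf.config.list_physical_devices("GPU"))

# PyTorch: True plus the device name if a CUDA GPU is available.
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```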

Have fun!
 

poorcody
Agree with @jerryk -- start with some server-based systems to get your feet wet.

Unfortunately, trying to do CUDA-based ML on a Mac is just not practical. If you want to use your own machine for model training, it's best to put together a Windows desktop with the Nvidia GPU(s), store it in a cold basement, and then remote into it from your Mac. This setup works acceptably for us. In many ways it is better than frying your Mac running models anyway. And you can add more GPUs, RAM, etc. as your needs increase.

And no, you cannot directly use an eGPU via Parallels. Parallels can piggy-back on the Mac's connection to it, but cannot access it directly (see the note at the bottom of this KB article).

Note that there are some Apple ML libraries available for the Mac/iOS if you just want to start exploring...

Enjoy! Fun stuff!
 

davidf18
Original poster


Thank you. I live in NYC, no basements!

I was hoping for a portable solution that I can bring with me for shorter jobs, working at a coffee shop or while traveling.

So in your case, you use your Windows setup in lieu of a cloud solution, correct?

Many thanks!
 

jerryk

No Windows setup, no Mac setup, no computer setup at all.

Just point a browser at the URL I listed. Everything runs in the cloud on servers and displays in your browser. You can get full GPU performance while doing ML on an iPad and, I suppose, a phone, although I have never tried it from a phone.

Works fine from Starbucks, library, etc.

FWIW, when I run locally in my office, I use Windows and Linux on a nice beefy under-desk server, with multiple monitors, mechanical keyboards, a real mouse, a standing desk, and such.

If you want to duplicate this in a laptop, start with something like a gaming system with an Nvidia GPU (1070, 1080, 2070) and run Windows or Linux. You can then run Jupyter Notebook (which is what Colab runs in the cloud) locally.
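If you go that route, a quick way to confirm the local CUDA stack is wired up is to place a small op on the GPU from a notebook or script (a sketch, assuming a GPU-enabled TensorFlow build with the matching CUDA toolkit and cuDNN installed):

```python
# Verify a local GPU-enabled TensorFlow install can see the Nvidia card
# and actually place work on it (assumes CUDA toolkit + cuDNN are installed).
import tensorflow as tf

print("GPUs visible:", tf.config.list_physical_devices("GPU"))

# Log which device each op runs on, then run a small matmul on the GPU.
tf.debugging.set_log_device_placement(True)
with tf.device("/GPU:0"):
    a = tf.random.normal((1024, 1024))
    b = tf.random.normal((1024, 1024))
    print("Checksum:", float(tf.reduce_sum(tf.matmul(a, b))))
```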
 

poorcody
Yes, sometimes we use our in-house "servers", but often we use the cloud -- it just depends on the type of work, the type of data, cost, etc. I can remote into the servers and work on them as if I were working locally. I really do recommend just starting with the cloud services, though. It may be all you ever need, it lets you get started quickly, and you can do it from anywhere.

If you just want to do local experimental small-network stuff, check out Apple's documentation too:
https://developer.apple.com/machine-learning/
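For instance, a model trained elsewhere can usually be converted for on-device use with Apple's coremltools package (a rough sketch, assuming coremltools 4 or later and a small Keras model; the file paths here are hypothetical):

```python
# Convert a trained Keras model to Core ML so it can run on macOS/iOS
# with Apple's frameworks. Paths are placeholders for illustration.
import coremltools as ct
import tensorflow as tf

keras_model = tf.keras.models.load_model("my_model.h5")  # hypothetical saved model
mlmodel = ct.convert(keras_model)
mlmodel.save("MyModel.mlmodel")
```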

By the way, does anyone happen to know of a discussion place like MacRumors for all things AI?
 

davidf18
Original poster
I'm an experienced deep learning person, thank you, and I'm familiar with the cloud services. I wanted to run something locally.

If you have any ideas about how to run locally, I'd appreciate it.

Thanks!
 

jerryk


My suggestion is to get a Windows/Linux system instead of a Mac for that. With a Mac and the need for Nvidia GPUs, you are fighting Apple, since Apple and Nvidia had a falling out after the 2011 GPU debacle.

I did see that some eGPU setups support the 10-series Nvidia cards, with limitations.
https://egpu.io/forums/mac-setup/nvidia-rtx-2070-support-on-15-macbook-pro/

No idea if the CUDA Toolkit or cuDNN work with this setup.
 

poorcody
Nvidia has had a web driver for Mojave ready since last year; it's just that Apple won't sign it. The hate seems to run pretty deep on Apple's part, so I would not expect it to happen.

I think the only practical way you are going to do "native" ML on a Mac is with Apple's tools that can use the AMD GPU. Or just skip the GPU: Mac versions of TensorFlow, PyTorch, etc. run in CPU-only mode. The performance gain from a laptop GPU is small compared to what you gain by going to a server of some sort anyway.
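To make the CPU-only point concrete, this is roughly what a stock macOS install reports (a minimal sketch, assuming standard pip installs of TensorFlow and PyTorch without CUDA support):

```python
# On a stock macOS install there is no CUDA device, so both frameworks
# report no GPU and run everything on the CPU.
import tensorflow as tf
import torch

print(tf.config.list_physical_devices("GPU"))  # expected: [] on macOS
print(torch.cuda.is_available())               # expected: False on macOS

# Pinning the device just makes the CPU fallback explicit.
with tf.device("/CPU:0"):
    x = tf.random.normal((512, 512))
    print(tf.matmul(x, x).shape)
```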

But a Windows laptop may be the only way to get what you want. I guess I don't understand your needs exactly; you want to take an eGPU to Starbucks?
 