Dear Apple GPU designers...
Please make training my neural network cost less $ and go faster.
For my Convolutional Neural Network (CNN) model with 5 convolutional blocks and 3 dense layers (sketched roughly below), training the model once (60 epochs) takes:
45 minutes - via Google Colab connected to an NVIDIA A100 GPU
2 days - via Google Colab connected to a Google Cloud 'free' CPU.
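For a sense of scale, here is roughly what that architecture looks like in Keras. This is a minimal sketch only: the filter counts, dense-layer widths, input shape, and class count are placeholders, not my actual hyperparameters.

    from tensorflow.keras import layers, models

    def build_cnn(input_shape=(128, 128, 3), num_classes=10):
        """Sketch of a CNN with 5 convolutional blocks and 3 dense layers.
        Layer sizes and input shape are placeholders, not the real model."""
        model = models.Sequential()
        model.add(layers.Input(shape=input_shape))
        # 5 convolutional blocks: two Conv2D layers followed by max pooling
        for filters in (32, 64, 128, 256, 512):
            model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
            model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
            model.add(layers.MaxPooling2D())
        model.add(layers.Flatten())
        # 3 dense layers
        model.add(layers.Dense(512, activation="relu"))
        model.add(layers.Dense(128, activation="relu"))
        model.add(layers.Dense(num_classes, activation="softmax"))
        return model

    model = build_cnn()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # Training something like this for 60 epochs is what takes
    # ~45 minutes on the A100 and ~2 days on the free CPU:
    # model.fit(train_ds, validation_data=val_ds, epochs=60)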
I've spent $110 in Google Colab Pro+ fees so far. My monthly maximum was used up in what seemed like a day.
Current MBP: a 64 GB RAM, 8 TB SSD, 2021 M1 Max, aka "Maxine."
Just now, I saw 58 of 64 GB of memory in use via Activity Monitor.
This is the most I've ever noticed her using. I only checked because overall, Maxine 'felt' slower.
Even though I am training a model with a ton of parameters, I am using Google Colab with TensorFlow, so I wouldn't expect so much local memory to be used. (Right?)
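If I wanted to sanity-check that assumption, a quick cell in the Colab notebook would confirm the heavy lifting happens on Google's runtime rather than on my Mac. A minimal sketch:

    import tensorflow as tf

    # This cell executes on the Colab VM, not on Maxine.
    print("TensorFlow version:", tf.__version__)
    print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
    # If the A100 (or any GPU) shows up here, training runs in the cloud;
    # the local Mac is only rendering the browser tab.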