Google could do better
-
I usually do all my ML-related work (model training and so on) on my local system, but just this once I decided to go with Google Colab and I'm disappointed. I'm only trying to train a simple MCNN on a rather small dataset (only 400 images), and the Colab session keeps crashing. Once the session crashes, you're back to square one and have to run the whole notebook again. Seriously, Google? :|

Of course I'm working with the free tier, but it would be nice if there were some way to warn us before resources are fully utilized. Maybe enforce a hard limit so RAM usage cannot exceed 10 GB, instead of just crashing? Or maybe put a better logging system in place? Even the logs are super unhelpful. I tried working with an even smaller dataset of only 50 images (8 MB) and the session still crashed. How can I use up 12 GB of RAM on an 8 MB dataset? :confused: So not cool, Google!
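For now the only mitigation I've found is checkpointing to Google Drive, so a crash at least doesn't cost the whole run. A minimal sketch, assuming a Keras model; the checkpoint path below is made up:

```python
# Minimal checkpointing sketch for Colab, assuming a Keras model.
# The checkpoint path is hypothetical; point it anywhere in your Drive.
import tensorflow as tf
from google.colab import drive

drive.mount("/content/drive")  # asks for authorization once per session

ckpt_path = "/content/drive/MyDrive/mcnn/weights.weights.h5"

checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    ckpt_path,
    save_weights_only=True,  # weights only: small files, fast to write
    save_freq="epoch")       # write after every epoch

# model.fit(x_train, y_train, epochs=50, callbacks=[checkpoint_cb])
# After a crash: rebuild the model, then resume with
# model.load_weights(ckpt_path)
```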
-
You are not alone. I had to use Google Colab for a specific piece of work and we hit the same thing. Not just that: while looking for a fix, I found plenty of others facing similar flavours of it: [Session crashes, then restarts and then crashes again for an unknown reason · Issue #513 · googlecolab/colabtools · GitHub](https://github.com/googlecolab/colabtools/issues/513) . I see people are still hitting it and no one is responding there. :doh: Mostly it comes down to memory, or the GPU runtime not being selected. Luckily for us, that project had the paid option to try, and the GPU worked okay. Totally agree, this is not up to expectations from Google. :thumbsdown:
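Since the crashes mostly point at memory, here is a minimal sketch of the kind of thing that can show what is going on: it watches the process RAM and feeds images in lazy batches instead of loading the whole set up front. The directory path, image size, and the commented-out training call are assumptions, not anything from the original posts.

```python
import os
import numpy as np
import psutil
from PIL import Image

def ram_used_gb():
    # Resident memory of this notebook's Python process, in GB.
    return psutil.Process(os.getpid()).memory_info().rss / 1e9

def image_batches(image_dir, batch_size=16, size=(224, 224)):
    # Yield float32 batches lazily so the full dataset never sits in RAM.
    paths = sorted(os.path.join(image_dir, f) for f in os.listdir(image_dir))
    for i in range(0, len(paths), batch_size):
        imgs = [np.asarray(Image.open(p).convert("RGB").resize(size),
                           dtype=np.float32) / 255.0
                for p in paths[i:i + batch_size]]
        yield np.stack(imgs)

for step, batch in enumerate(image_batches("data/train")):  # hypothetical path
    # model.train_on_batch(batch, labels)  # your training step would go here
    if step % 10 == 0:
        print(f"step {step}: RAM in use ~ {ram_used_gb():.2f} GB")
```

If the printed figure climbs steadily even with a tiny dataset, the leak is in the training loop rather than the data loading.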
Latest CodeProject post: Data Visualization - Insights with Matplotlib. To read all my blog posts, visit: Learn by Insight...
-
MehreenTahir wrote:
How can I use up 12 GB of RAM on an 8 MB dataset?
Try asking Google?
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
-
Maybe they borrowed Microsoft's QA department? X|
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
Anonymous posting isn't allowed. I signed in already, elephant off. Original post lost.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.