Google could do better

The Lounge
Tags: question, ai-models, learning
#1 MehreenTahir wrote:

I usually do all my ML-related work, i.e. model training etc., on my local system, but just this once I decided to go with Google Colab and I'm disappointed. I'm only trying to train a simple MCNN on a rather small dataset (only 400 images) and the Colab session keeps crashing. Once a session crashes, you're back to square one and have to run the whole notebook again. Seriously, Google? :| Of course I'm working with the free tier, but it would be nice if they had some better way to warn us before resources are fully used up. Maybe have a hard restriction that RAM utilization cannot exceed 10 GB instead of a crash? Or maybe put a better logging system in place? Even the logs are super unhelpful. I tried working with an even smaller dataset of only 50 images (8 MB) and the session still crashed. How can I use up 12 GB of RAM on an 8 MB dataset? :confused: So not cool, Google!
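
One plausible explanation, sketched below with assumed image dimensions (the post does not state them): JPEGs are tiny on disk but expand enormously once decoded into float32 arrays, and loading the whole set up front, plus any intermediate copies made during preprocessing, multiplies that again. Streaming batches from disk keeps the footprint bounded.

```python
# A minimal sketch (assumption: ~4000x3000 px photos decoded to float32), not
# taken from the thread, showing how an 8 MB folder of JPEGs can still exhaust
# Colab's ~12 GB of RAM once everything is decoded and held in memory at once.

def decoded_mb(width, height, channels=3, bytes_per_value=4):
    """Memory (MB) for one image decoded to a float32 array."""
    return width * height * channels * bytes_per_value / 1e6

per_image = decoded_mb(4000, 3000)  # ~144 MB per decoded image
print(f"{per_image:.0f} MB per image, {50 * per_image / 1000:.1f} GB for 50 images")

# Mitigation: never materialise the whole dataset as one array; decode only a
# batch at a time inside the training loop and let each batch be garbage-collected.
def batch_paths(paths, batch_size=8):
    for i in range(0, len(paths), batch_size):
        yield paths[i:i + batch_size]  # decode just these, train on them, discard
```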

#2 Sandeep Mewara wrote:

You are not alone. I had to use Google Colab for a specific piece of work and we ran into the same thing. And not just that: while looking for a fix, I found plenty of others facing similar flavours of it: [Session crashes, then restarts and then crashes again for an unknown reason · Issue #513 · googlecolab/colabtools · GitHub](https://github.com/googlecolab/colabtools/issues/513). People are still reporting it and no one is responding there. :doh: Mostly it comes down to memory, or to the GPU runtime not being selected. Luckily for us, that project had the paid option to try, and using the GPU worked okay. Totally agree, this is not up to expectation from Google. :thumbsdown:

Latest CodeProject post: Data Visualization - Insights with Matplotlib. To read all my blog posts, visit: Learn by Insight...
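
For anyone hitting the same wall, a quick sanity check along these lines (a sketch assuming a Colab-style runtime, where psutil is preinstalled and nvidia-smi exists only when a GPU runtime is selected) shows how much RAM the session actually has and whether a GPU is attached at all:

```python
# Minimal sketch: report the runtime's RAM and whether a GPU is attached.
import shutil
import subprocess

import psutil

ram = psutil.virtual_memory()
print(f"RAM: {ram.total / 1e9:.1f} GB total, {ram.available / 1e9:.1f} GB available")

if shutil.which("nvidia-smi"):
    # Only present on GPU runtimes; prints the attached GPU and its memory.
    print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)
else:
    print("No GPU runtime attached (Runtime -> Change runtime type -> GPU).")
```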

#3 Lost User wrote:

Anonymous posting isn't allowed. I signed in already, elephant off. Original post lost.

Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.

#4 BillWoodruff wrote:

MehreenTahir wrote:

How can I use up 12 GB of RAM on an 8 MB dataset?

Try asking Google?

«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali

#5 Daniel Pfeffer wrote:

Maybe they borrowed Microsoft's QA department? X|

Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

#6 enhzflep wrote:

:laugh: "CodeProject could do even morer betterer"?
