Colab RAM is Almost Full After Training Although I Delete The Variables
Recently I have been using a Google Colab GPU to train a model. After training, I delete the large variables I used, but I notice that the RAM is still full. What is actually happening, what exactly is still in RAM, and how can I free it without restarting the runtime?
Solution 1:[1]
I don't think the variables themselves are what is using most of the memory.
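Before freeing anything, it can help to see how much RAM the runtime is actually using and which globals hold the biggest objects. This is a minimal sketch, not from the original answer; it uses psutil (preinstalled on Colab) and sys.getsizeof, which only counts the top-level object, so it underestimates nested containers but is reasonable for NumPy arrays.

import sys
import psutil

mem = psutil.virtual_memory()
print(f"RAM used: {mem.used / 1e9:.2f} GB of {mem.total / 1e9:.2f} GB")

# List global variables above roughly 1 MB to spot the heavy ones
for name, obj in list(globals().items()):
    size = sys.getsizeof(obj)
    if size > 1e6:
        print(name, f"{size / 1e6:.1f} MB")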
If you have the choice, you can try Colab Pro, which gives you the option of high-memory VMs. The price is $9.99/month.
One (maybe not very good) option is to use a smaller dataset. Note that you have much more disk space than RAM. What I did was split the original dataset into several smaller sets and then train the neural network on these datasets one at a time, as sketched below.
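Here is a minimal sketch of that split-the-dataset idea, assuming a NumPy feature array X and label array y; the chunk count, file names, and the commented training call are made up for illustration.

import numpy as np

X = np.random.rand(100_000, 128).astype(np.float32)  # placeholder data
y = np.random.randint(0, 10, size=100_000)

n_chunks = 10
for i, (X_part, y_part) in enumerate(zip(np.array_split(X, n_chunks),
                                         np.array_split(y, n_chunks))):
    np.savez(f"chunk_{i}.npz", X=X_part, y=y_part)  # disk is cheaper than RAM

del X, y  # drop the full arrays once the chunks are written to disk

# Later, train on one chunk at a time so only ~1/n_chunks of the data is in RAM
for i in range(n_chunks):
    part = np.load(f"chunk_{i}.npz")
    # model.fit(part["X"], part["y"])  # hypothetical training call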
Another way is to check your code for variables that hold a large array but are only used once. You can then rebind those variables to something small, such as an empty list, so the array can be garbage collected. For example:
import numpy as np

a = np.load("a_very_large_array.npy")
foo(a)  # the only place the array is used; foo stands in for your own code
a = []  # drop the reference so the large array can be reclaimed
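A related trick, not in the original answer: instead of rebinding a to an empty list, delete the name and trigger garbage collection explicitly.

import gc

del a          # remove the last reference to the large array
gc.collect()   # ask Python to reclaim the memory immediately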
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Heran |
