How to Use GPU RAM in Google Colab efficiently?
I am building a multi-label image classifier. For this I need to load around 7,867 training images. While loading the images, RAM usage rises from 0.92 GB to 12.5 GB.
After loading, when I convert the images into a NumPy array, RAM usage reaches the full available 25.54 GB and the code stops executing with the error "Your session crashed".
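For context, a rough back-of-the-envelope estimate (assuming img_to_array returns float32 arrays, which is the Keras default) suggests why memory runs out; the numbers below are my own calculation, not measured values:

# Rough memory estimate (assumption: float32 arrays, the Keras img_to_array default)
num_images = 7867
height, width, channels = 400, 400, 3
bytes_per_float32 = 4

list_bytes = num_images * height * width * channels * bytes_per_float32
print(f"Images held in the Python list: {list_bytes / 1024**3:.1f} GiB")  # ~14.1 GiB

# np.array(train_images) builds a second, contiguous copy of the data,
# so the peak during the conversion is roughly double that.
print(f"Approximate peak during np.array(...): {2 * list_bytes / 1024**3:.1f} GiB")  # ~28.1 GiB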
Sample code that I am using:
import numpy as np
from tqdm import tqdm
from keras.preprocessing import image  # or: from tensorflow.keras.preprocessing import image

train_images = []
for i in tqdm(range(train.shape[0])):
    # Load each image at 400x400x3 and scale pixel values to [0, 1]
    img = image.load_img(
        '/content/Multi_Label_dataset/Images/' + train['Id'][i] + '.jpg',
        target_size=(400, 400, 3)
    )
    img = image.img_to_array(img)
    img = img / 255
    train_images.append(img)
Up to this point, RAM usage was 12.52 GB.
X = np.array(train_images)
While executing this line, the RAM usage indicator turns red and the "Session crashed" message pops up.
How can I handle this?
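One workaround I am considering (a minimal sketch, assuming the same 400x400x3 target size and that a lower-precision dtype is acceptable) is to pre-allocate a single NumPy array and fill it in place, so the images are never held in a list and then copied:

import numpy as np
from tqdm import tqdm
from keras.preprocessing import image  # or: from tensorflow.keras.preprocessing import image

# Pre-allocate one contiguous array so no second copy is needed later.
# float16 is an assumption to halve memory; float32 also works if RAM allows.
X = np.empty((train.shape[0], 400, 400, 3), dtype=np.float16)

for i in tqdm(range(train.shape[0])):
    img = image.load_img(
        '/content/Multi_Label_dataset/Images/' + train['Id'][i] + '.jpg',
        target_size=(400, 400, 3)
    )
    # Write directly into the pre-allocated slot instead of appending to a list
    X[i] = image.img_to_array(img) / 255

Would this avoid the duplicate copy, or is a generator / tf.data pipeline the more usual way to handle a dataset of this size in Colab?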
Sources
Stack Overflow, licensed under CC BY-SA 3.0.