Colab crashes when trying to create confusion matrix

I am trying to create a confusion matrix for my test set. My test set consists of 3585 images. Whenever I try to run the following code:

x_test, y_test = next(iter(dataloader))
y_pred = resnet(x_test)

Google Colab crashes after using all of the available RAM. Does anyone have a workaround for this? Should I do this in batches?



Solution 1:[1]

Should I do this in batches?

Yes! Try reducing the batch size and running inference one batch at a time.

dataloader = ...  # recreate the DataLoader here with a smaller batch_size

y_pred = []
with torch.no_grad():                    # no gradient buffers -> much less memory
    for x_batch, y_batch in dataloader:  # the loader yields (images, labels) pairs
        batch_y_pred = resnet(x_batch)
        y_pred.append(batch_y_pred)

I used a list with append here; you can also concatenate the per-batch outputs into a single tensor with torch.cat, or accumulate them in another way.
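
For the original goal of a confusion matrix over the whole test set, the per-batch predictions and labels can be accumulated and compared at the end. Below is a minimal sketch, assuming the `resnet` and `dataloader` names from the question, that the loader yields (images, labels) pairs, and that scikit-learn is available for confusion_matrix:

import torch
from sklearn.metrics import confusion_matrix

resnet.eval()                        # switch off dropout / batch-norm updates
all_preds, all_labels = [], []
with torch.no_grad():                # inference only, so skip autograd bookkeeping
    for x_batch, y_batch in dataloader:
        logits = resnet(x_batch)
        all_preds.append(logits.argmax(dim=1).cpu())  # predicted class per image
        all_labels.append(y_batch.cpu())

y_pred = torch.cat(all_preds).numpy()
y_true = torch.cat(all_labels).numpy()
cm = confusion_matrix(y_true, y_pred)
print(cm)

Because only one batch of activations is in memory at a time (and no gradients are stored), this stays well within Colab's RAM even for 3585 images.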

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: trsvchn