I'm not sure if my RoBERTa model is training or not

So, I'm trying to train a RoBERTa model with a classifier head, and I'm not 100% sure whether it's still stuck computing embeddings or just isn't working; it's been like this for a while now. [screenshot of the progress output]

I am training with pytorch-lightning on 50k datapoints, using the full-size RoBERTa model. This is the code I'm using to train:

trainer = pl.Trainer(max_epochs = config['n_epochs'], gpus = 1, num_sanity_val_steps = 1)
trainer.fit(model, submission_data_module)
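One quick way to check whether the loop runs at all (my suggestion, not part of the original post) is Lightning's `fast_dev_run` flag, which runs a single train and validation batch end to end instead of a full epoch. A minimal sketch, reusing the `model` and `submission_data_module` objects from the snippet above:

```python
import pytorch_lightning as pl

# fast_dev_run executes one training batch and one validation batch,
# so a hang here means the problem is in the batch itself (e.g. data
# loading or the forward pass), not in the length of the epoch.
trainer = pl.Trainer(fast_dev_run=True, gpus=1)
trainer.fit(model, submission_data_module)
```

If that single batch completes, the loop works and a full epoch is simply slow; if it also hangs, the first batch never finishes.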

And these are my hyperparameters:

config = {
    'model_name': 'roberta-base',
    'n_labels': len(labels),
    'bs': 128,
    'lr': 1.5e-6,
    'warmup': 0.2,
    'train_size': len(submission_data_module.train_dataloader()),
    'w_decay': 0.001,
    'n_epochs': 1
}
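For scale, with `bs = 128` and 50k datapoints, one epoch is about 391 optimizer steps, and if `warmup = 0.2` is a fraction of total steps, the schedule warms up for roughly 78 of them. A minimal sketch of that arithmetic (the helper name is mine, not from the original code):

```python
import math

def schedule_steps(n_datapoints, batch_size, n_epochs, warmup_frac):
    """Total and warmup optimizer steps for a linear-warmup schedule."""
    steps_per_epoch = math.ceil(n_datapoints / batch_size)
    total_steps = steps_per_epoch * n_epochs
    warmup_steps = int(total_steps * warmup_frac)
    return total_steps, warmup_steps

total, warmup = schedule_steps(50_000, 128, 1, 0.2)
print(total, warmup)  # 391 78
```

With only ~391 steps in the whole run, each step on full-size RoBERTa at batch size 128 can take a while on a single GPU, so a bar that sits at 0% for a few minutes is not necessarily stuck.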

I set the sanity-val steps and epoch count low just to see if it's working. Any ideas?

I just noticed the first screenshot is hard to read: it shows the `Validation sanity check: 0%` bar followed by `0/1 [00:00<?, ?it/s]`.



Sources

This question follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow