How to cache HuggingFace model and tokenizer

I'm using the Hugging Face model distilbert-base-uncased with the DistilBertTokenizerFast tokenizer, and I'm currently loading both with .from_pretrained().

I want to cache them so that they work without an internet connection as well.

I tried the cache_dir parameter of from_pretrained(), but it didn't work.

Any suggestions?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
