Can't load a transformers model from Hugging Face

I was trying to load a transformers model from the Hugging Face Hub in my local Jupyter notebook. Here's the code:

from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

Then it failed, and the traceback is:

OSError: Can't load config for 'bert-base-cased'. 
If you were trying to load it from 'https://huggingface.co/models', 
make sure you don't have a local directory with the same name. 
Otherwise, make sure 'bert-base-cased' is the correct path to a 
directory containing a config.json file 

I'm pretty sure there's no local directory with the same name. How can I solve this problem? Or is there another way to load the model?
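To rule out the shadowing case the error message mentions: `from_pretrained` resolves a local path before falling back to the Hub, so a directory named like the model id in the working directory (or wherever a relative path resolves from) would be picked up instead. A minimal stdlib sketch to check for that, assuming the current working directory is where the notebook runs (the helper name is hypothetical):

```python
from pathlib import Path

def shadows_model_id(model_id: str, base: Path = Path(".")) -> bool:
    # A local directory whose name matches the model id would be
    # resolved by from_pretrained before the Hub is contacted,
    # and it would fail without a config.json inside it.
    return (base / model_id).is_dir()

print(shadows_model_id("bert-base-cased"))
```

If this prints False, the problem is more likely network-related (proxy, firewall, or offline environment) rather than a naming clash.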



Source: Stack Overflow, licensed under CC BY-SA 3.0.