OSError: Can't load config for 'bert-base-uncased'

See original GitHub issue

Environment info

It happens on my local machine, on Colab, and for my colleagues as well.

  • transformers version:
  • Platform: Windows, Colab
  • Python version: 3.7
  • PyTorch version (GPU?): 1.8.1 (GPU yes)
  • Tensorflow version (GPU?):
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Who can help

@LysandreJik It is related to 'bert-base-uncased'

Information

Hi, I've been getting this error suddenly since this afternoon. Everything worked fine for days before. It happens on my local machine, on Colab, and for my colleagues as well. I can access the file in a browser at https://huggingface.co/bert-base-uncased/resolve/main/config.json with no problem. By the way, I'm in Singapore. Any urgent help would be appreciated because I'm rushing a project and am stuck on this.

Thanks

[screenshot of the error traceback]

```
403 Client Error: Forbidden for url: https://huggingface.co/bert-base-uncased/resolve/main/config.json

HTTPError                                 Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    505                 use_auth_token=use_auth_token,
--> 506                 user_agent=user_agent,
    507             )

6 frames

HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/bert-base-uncased/resolve/main/config.json

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    516                 f"- or '{pretrained_model_name_or_path}' is the correct path to a directory containing a {CONFIG_NAME} file\n\n"
    517             )
--> 518             raise EnvironmentError(msg)
    519 
    520         except json.JSONDecodeError:

OSError: Can't load config for 'bert-base-uncased'. Make sure that:

- 'bert-base-uncased' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'bert-base-uncased' is the correct path to a directory containing a config.json file
```
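For reference, the URL that returns the 403 above is derived from the model identifier in a fixed way. A simplified sketch of that resolution (the real logic in transformers also handles revisions, mirrors, and the local cache):

```python
# Simplified sketch of how a hub config URL is formed from a model id.
# This only reproduces the URL seen in the traceback; it is not the
# actual transformers implementation.
def hub_config_url(model_id: str, revision: str = "main") -> str:
    return f"https://huggingface.co/{model_id}/resolve/{revision}/config.json"

print(hub_config_url("bert-base-uncased"))
# -> https://huggingface.co/bert-base-uncased/resolve/main/config.json
```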

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 15 (7 by maintainers)

Top GitHub Comments

6 reactions
WinMinTun commented, Jul 30, 2021

Still not working online, but I managed to get it running locally:

```shell
git clone https://huggingface.co/bert-base-uncased
```

```python
# model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained(BERT_LOCAL_PATH, local_files_only=True)

# tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained(BERT_LOCAL_PATH, local_files_only=True)

adapter_name = model.load_adapter(localpath, config=config, model_name=BERT_LOCAL_PATH)
```
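A process-wide alternative to passing `local_files_only=True` on every call: recent transformers releases honor the `TRANSFORMERS_OFFLINE` environment variable. A minimal sketch, assuming a local clone exists (the path below is illustrative):

```python
import os

# Must be set before transformers is imported; with it set, the library
# reads only from local paths / the local cache and makes no HTTP calls.
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# Point at the repo produced by `git clone` above (path is illustrative).
BERT_LOCAL_PATH = "./bert-base-uncased"
```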

2 reactions
jason-weddington commented, Aug 1, 2021

With additional testing, I've found that this issue only occurs with adapter-transformers, the AdapterHub.ml-modified version of the transformers module. With the vanilla HuggingFace module, we can pull pretrained weights without issue.

Using adapter-transformers, this is now working again from Google Colab, but it is still failing locally and from servers running in AWS. Interestingly, with adapter-transformers I get a 403 even if I try to load a nonexistent model (e.g. fake-model-that-should-fail). I would expect this to fail with a 401, as there is no corresponding config.json on huggingface.co. The fact that it fails with a 403 seems to indicate that something in front of the web host is rejecting the request before the web host has a chance to respond with a not-found error.
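A rough way to separate the two failure modes described above is to probe the config URL directly and interpret the status code. The probe and the interpretation below are a diagnostic sketch, not part of transformers:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError


def probe_config(model_id: str) -> int:
    """Return the HTTP status for a model's config.json on the hub."""
    url = f"https://huggingface.co/{model_id}/resolve/main/config.json"
    try:
        return urlopen(Request(url, method="HEAD"), timeout=10).getcode()
    except HTTPError as err:
        return err.code


def classify_status(code: int) -> str:
    # A 403 on a model that plainly exists (or on one that plainly does
    # not) points at something in front of the web host rejecting the
    # request; a 404/401 means the repo or file genuinely was not found
    # or requires credentials.
    if code == 200:
        return "reachable"
    if code == 403:
        return "blocked upstream (CDN/WAF), not a missing model"
    if code in (401, 404):
        return "model or config.json not found (or auth required)"
    return f"unexpected status {code}"
```

For example, comparing `classify_status(probe_config("bert-base-uncased"))` with a deliberately nonexistent id: the issue reports 403 for both, which matches the "blocked upstream" reading.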


