NoSuchFile: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from onnx/bert-base-cased/model.onnx failed:Load model onnx/bert-base-cased/model.onnx failed. File doesn't exist


Based on a Stack Overflow post.

Kernel: conda_pytorch_p36. I performed Restart & Run All and refreshed the file view in my working directory.

I’m following along with this code tutorial (the first Python code module).

Update: I need to have a transformers.onnx file. Where can I download this?


pip install transformers
pip install onnxruntime
pip install onnx

Code:

import onnxruntime as ort

from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")

ort_session = ort.InferenceSession("onnx/bert-base-cased/model.onnx")

inputs = tokenizer("Using BERT in ONNX!", return_tensors="np")
outputs = ort_session.run(["last_hidden_state", "pooler_output"], dict(inputs))
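The NO_SUCHFILE error in the traceback below means exactly what it says: ONNX Runtime only loads local files and never downloads a model, so the session fails because nothing was ever exported to that path. A minimal guard that fails fast with a helpful message might look like this (the export command in the comment assumes the `transformers.onnx` export module shipped with transformers ≥ 4.9; check your installed version):

```python
from pathlib import Path


def ensure_model(path: str) -> Path:
    """Fail fast with a helpful message when the ONNX file is missing."""
    model_path = Path(path)
    if not model_path.is_file():
        # The file has to be exported first, e.g. (assumed CLI, transformers >= 4.9):
        #   python -m transformers.onnx --model=bert-base-cased onnx/bert-base-cased/
        raise FileNotFoundError(
            f"{model_path} does not exist - export the model to this path first"
        )
    return model_path
```

Calling `ensure_model("onnx/bert-base-cased/model.onnx")` before constructing the `InferenceSession` turns the opaque runtime error into an actionable one.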

Traceback:

---------------------------------------------------------------------------
NoSuchFile                                Traceback (most recent call last)
<ipython-input-5-6c2daa1cf144> in <module>
      5 tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
      6 
----> 7 ort_session = ort.InferenceSession("onnx/bert-base-cased/model.onnx")
      8 
      9 inputs = tokenizer("Using BERT in ONNX!", return_tensors="np")

~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py in __init__(self, path_or_bytes, sess_options, providers, provider_options)
    278 
    279         try:
--> 280             self._create_inference_session(providers, provider_options)
    281         except RuntimeError:
    282             if self._enable_fallback:

~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py in _create_inference_session(self, providers, provider_options)
    305         session_options = self._sess_options if self._sess_options else C.get_default_session_options()
    306         if self._model_path:
--> 307             sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
    308         else:
    309             sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)

NoSuchFile: [ONNXRuntimeError] : 3 : NO_SUCHFILE : Load model from onnx/bert-base-cased/model.onnx failed:Load model onnx/bert-base-cased/model.onnx failed. File doesn't exist

Please let me know if there’s anything else I can add to the post.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
jcwchen commented, Jan 12, 2022

Sorry, I am not a transformers expert, so I am not sure where the problem is. I would suggest raising this question in the transformers repo to get the best help from them. It seems that transformers was not installed successfully, because it somehow cannot find PreTrainedModel in transformers. Perhaps you can try removing your existing installation and reinstalling it.

0 reactions
danielbellhv commented, Jan 20, 2022

Both installations are now failing with [Errno 28] No space left on device. This seems to be its own issue, regarding the storage on the AWS SageMaker instance.
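For the Errno 28 error above, a quick first step is to confirm which filesystem is full and how much space pip's own cache is consuming. A sketch (the mount point mentioned in the comment is the typical location on a SageMaker notebook instance, stated here as an assumption):

```shell
# Show free space per filesystem; on a SageMaker notebook instance the
# notebook volume is usually mounted at /home/ec2-user/SageMaker (assumption).
df -h

# See how much pip's download cache is holding; it is safe to delete.
du -sh ~/.cache/pip 2>/dev/null || echo "no pip cache found"
```

If the pip cache is large, clearing it (or reinstalling with `pip install --no-cache-dir`) often frees enough space for the install to complete.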
