'BertTokenizerFast' object has no attribute '_in_target_context_manager'
See original GitHub issue.

I am using version 0.11.0.
This is a saved model that I imported using BERTopic.load(); it worked fine last week and the week before that. I tried to do some work on it today and got this error when calling topic_model.find_topic('...'):
'BertTokenizerFast' object has no attribute '_in_target_context_manager'.
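For context, here is a minimal sketch of the load-and-search pattern described above. The model path and search term are placeholders, and `find_topics` is the documented name of BERTopic's search method:

```python
from bertopic import BERTopic

# Placeholder path; substitute wherever the model was saved.
topic_model = BERTopic.load("my_topic_model")

# BERTopic's documented search method; this is the kind of call that surfaced
# the AttributeError from the underlying transformers tokenizer.
topics, similarities = topic_model.find_topics("search term", top_n=5)
```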
Please and Thank you 😃
Issue Analytics
- Created a year ago
- Comments: 8 (2 by maintainers)
Found the issue: the 4.22.x and 4.21.x releases of transformers raise the same error.
Downgrading to an older release, transformers==4.20.1, made it work fine again.
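If you want to apply the same workaround, a minimal sketch of pinning and verifying the downgrade looks like this (the install command itself runs in your shell or requirements file):

```python
# Workaround from the comment above: downgrade transformers before loading the
# saved model, e.g. by running `pip install "transformers==4.20.1"` in the
# environment that BERTopic uses.

import transformers

# Sanity check that the interpreter actually picked up the pinned release.
assert transformers.__version__ == "4.20.1", transformers.__version__
```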
Thank you! I also tried to use the new version to save and load the model. It worked well!
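For anyone following the same route, a sketch of re-fitting and re-saving the model under the upgraded transformers release and loading it back; the corpus, path, and parameters below are placeholders:

```python
from sklearn.datasets import fetch_20newsgroups
from bertopic import BERTopic

# Small public corpus so the sketch is self-contained; substitute your own documents.
docs = fetch_20newsgroups(subset="train",
                          remove=("headers", "footers", "quotes")).data[:1000]

# Re-fit and re-save under the upgraded transformers release, then load it back.
topic_model = BERTopic()
topics, probs = topic_model.fit_transform(docs)
topic_model.save("my_topic_model")            # placeholder path
reloaded = BERTopic.load("my_topic_model")

# The reloaded model should answer searches without the AttributeError.
print(reloaded.find_topics("religion", top_n=5))
```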