runtime error

Exit code: 1. Reason: %|██        | 1/5 [00:09<00:36, 9.09s/it]
Loading checkpoint shards:  40%|████      | 2/5 [00:10<00:14, 4.82s/it]
Loading checkpoint shards:  80%|████████  | 4/5 [00:12<00:02, 2.23s/it]
Loading checkpoint shards: 100%|██████████| 5/5 [00:12<00:00, 2.51s/it]
generation_config.json: 100%|██████████| 184/184 [00:00<00:00, 1.06MB/s]
tokenizer_config.json: 100%|██████████| 51.3k/51.3k [00:00<00:00, 99.0MB/s]
special_tokens_map.json: 100%|██████████| 439/439 [00:00<00:00, 3.17MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 84, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_id)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 896, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2291, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2525, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 134, in __init__
    raise ValueError(
ValueError: Couldn't instantiate the backend tokenizer from one of: (1) a `tokenizers` library serialization file, (2) a slow tokenizer instance to convert or (3) an equivalent slow tokenizer class to instantiate and convert. You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
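The ValueError means the model on the Hub ships only a slow, SentencePiece-based tokenizer, and `transformers` cannot convert it to a fast one because the `sentencepiece` package is not installed in the container. The usual fix is to add `sentencepiece` to the Space's requirements.txt (or run `pip install sentencepiece`) and restart. A minimal sketch of a startup check, assuming the environment is otherwise as shown in the traceback (the function name here is hypothetical):

```python
import importlib.util

def missing_tokenizer_deps():
    """Return the optional tokenizer backends that are not installed.

    transformers needs sentencepiece to convert a slow SentencePiece
    tokenizer into a fast one; when it is absent,
    AutoTokenizer.from_pretrained raises the ValueError seen above.
    """
    required = ["sentencepiece"]
    # find_spec returns None when the package cannot be imported.
    return [name for name in required if importlib.util.find_spec(name) is None]

# If this prints a non-empty list, add those package names to
# requirements.txt (or pip install them) and restart the Space.
print(missing_tokenizer_deps())
```

Running this check before calling AutoTokenizer.from_pretrained turns the opaque backend-tokenizer error into an actionable message about the missing dependency.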
