load pretrained model error

#6
by Yin-Xie - opened

Traceback (most recent call last):
  File "/mnt2/yinxie/code/Yi/VL/single_inference.py", line 110, in <module>
    single_infer(args)
  File "/mnt2/yinxie/code/Yi/VL/single_inference.py", line 32, in single_infer
    tokenizer, model, image_processor, context_len = load_pretrained_model(model_path)
  File "/mnt2/yinxie/code/Yi/VL/llava/mm_utils.py", line 78, in load_pretrained_model
    tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
  File "/root/miniconda3/envs/llava_xy/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 702, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/root/miniconda3/envs/llava_xy/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1841, in from_pretrained
    return cls._from_pretrained(
  File "/root/miniconda3/envs/llava_xy/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2060, in _from_pretrained
    raise ValueError(
ValueError: Non-consecutive added token '' found. Should have index 64000 but has index 0 in saved vocabulary.

You need to have transformers version 4.34.0 for this.
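For reference, a minimal sketch of applying that fix and retrying the tokenizer load; the model path below is a placeholder for your local Yi-VL checkpoint, not a specific repo name:

    # First: pip install transformers==4.34.0
    import transformers
    from transformers import AutoTokenizer

    # Confirm the pinned version is the one actually active in this environment.
    assert transformers.__version__ == "4.34.0", transformers.__version__

    # Same call that failed inside load_pretrained_model in the traceback above.
    tokenizer = AutoTokenizer.from_pretrained("/path/to/Yi-VL-model", use_fast=False)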

As your question seems to have been answered, and if there is nothing else we can help you with on this matter, I will be closing this discussion for now.

If you have any further questions, feel free to reopen this discussion or start a new one.

Thank you for your contribution to this community!

richardllin changed discussion status to closed
