Transformers version issue #4
opened by wututuang
With transformers version 4.37.2:

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('Alibaba-NLP/gte-Qwen1.5-7B-instruct')
model = AutoModel.from_pretrained('Alibaba-NLP/gte-Qwen1.5-7B-instruct', trust_remote_code=True)

This fails with:
ModuleNotFoundError: No module named 'transformers_modules.Alibaba-NLP.gte-Qwen1'
In version 4.37.2 of the transformers library there is an issue with loading custom (trust_remote_code) models whose repository names contain a "."; we recommend upgrading to version >= 4.39.2.
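If you are unsure which version is installed, a minimal runtime check like the following can help (a sketch only; it assumes the packaging package, which ships as a transformers dependency):

import transformers
from packaging import version

# Per the recommendation above, repo names containing "." need transformers >= 4.39.2.
if version.parse(transformers.__version__) < version.parse("4.39.2"):
    raise RuntimeError(
        f"transformers {transformers.__version__} cannot load repos with '.' in the name; "
        "upgrade with: pip install -U 'transformers>=4.39.2'"
    )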
zyznull changed discussion status to closed
I have the same error when using encode_multi_process(), with transformers versions 4.39.2 and 4.41.0.
My code:
# embedding_model is a SentenceTransformer wrapping 'Alibaba-NLP/gte-Qwen1.5-7B-instruct'
pool = embedding_model.start_multi_process_pool()
embeddings = embedding_model.encode_multi_process(docs, pool)
embedding_model.stop_multi_process_pool(pool)
Error:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/dinghao/anaconda3/envs/taxonomy/lib/python3.10/multiprocessing/spawn.py", line 116, in spawn_main
exitcode = _main(fd, parent_sentinel)
File "/home/dinghao/anaconda3/envs/taxonomy/lib/python3.10/multiprocessing/spawn.py", line 126, in _main
self = reduction.pickle.load(from_parent)
ModuleNotFoundError: No module named 'transformers_modules.Alibaba-NLP.gte-Qwen1'
Workaround: remove the "." from the model name (i.e., load the model from a path whose name contains no ".") when using multi-process encoding.
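A minimal sketch of that workaround, assuming sentence-transformers and huggingface_hub are installed; the directory name models/gte-qwen1_5-7b-instruct is just an example, any path component without a "." should do:

from huggingface_hub import snapshot_download
from sentence_transformers import SentenceTransformer

if __name__ == "__main__":  # needed because multi-process encoding spawns worker processes
    # Materialize the repo under a directory whose name contains no ".", so the
    # dynamically generated transformers_modules package can be re-imported inside
    # the spawned workers.
    local_dir = snapshot_download(
        "Alibaba-NLP/gte-Qwen1.5-7B-instruct",
        local_dir="models/gte-qwen1_5-7b-instruct",
    )
    embedding_model = SentenceTransformer(local_dir, trust_remote_code=True)

    docs = ["first example document", "second example document"]
    pool = embedding_model.start_multi_process_pool()
    embeddings = embedding_model.encode_multi_process(docs, pool)
    embedding_model.stop_multi_process_pool(pool)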