KeyError: 'mistral'

#23
by bdambrosio - opened

Running Python 3.11, transformers 4.34.0.dev0.

I know this must be something dumb, but what?
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
Traceback (most recent call last):
File "", line 1, in
File "/home/bruce/miniconda3/envs/gptq/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 527, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/bruce/miniconda3/envs/gptq/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1041, in from_pretrained
config_class = CONFIG_MAPPING[config_dict["model_type"]]
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/bruce/miniconda3/envs/gptq/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 734, in getitem
raise KeyError(key)
KeyError: 'mistral'

Perhaps 'pip install git+https://github.com/huggingface/transformers'?
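
The KeyError typically means the installed transformers build predates Mistral support: 'mistral' is missing from the CONFIG_MAPPING registry that the traceback points at (Mistral shipped in the 4.34.0 release, so a 4.34.0.dev0 snapshot cut before that merge won't have it). As a quick sanity check, here is a minimal sketch that uses only names already visible in the traceback:

```python
# Check whether the installed transformers knows about the "mistral" model_type.
import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

print(transformers.__version__)

try:
    CONFIG_MAPPING["mistral"]
    print("mistral is registered -- the KeyError comes from somewhere else")
except KeyError:
    print("mistral is not registered -- upgrade transformers and retry")
```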

Good thought, but no luck.

Can you try pip install git+https://github.com/huggingface/transformers.git -U?

That didn't change anything in my existing env.
Then I tried creating a clean conda env and doing only your suggested pip install.
Worked! My default conda env has way too much junk from Kaggle comps...
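
For anyone landing here later, a lightweight way to confirm the fix without downloading the full weights (the failing call was AutoConfig.from_pretrained under the hood) is a sketch like this:

```python
# After reinstalling transformers in the clean env, the config should resolve
# without a KeyError; this only fetches the small config.json, not the weights.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
print(config.model_type)  # expected: "mistral"
```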

Thank you for your support!

bdambrosio changed discussion status to closed
