Could not find GemmaForCausalLM neither in <module 'transformers.models.gemma'

#36
by chenwei1984 - opened

I'm getting this error: "Could not find GemmaForCausalLM neither in <module 'transformers.models.gemma'". How can I fix it?

Update transformers to 4.38 or 4.38.1 (for PyTorch 2.1). Gemma support was only added in 4.38, so older versions cannot find GemmaForCausalLM.
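A minimal sketch of how you could check the installed version at runtime. The helper name `version_at_least` is hypothetical; in practice you would pass it `transformers.__version__` instead of the literal strings shown:

```python
def version_at_least(installed: str, required: tuple) -> bool:
    """Compare a dotted version string against a (major, minor) requirement."""
    parts = installed.split(".")
    return (int(parts[0]), int(parts[1])) >= required

# Gemma support landed in transformers 4.38, so anything older will fail
# (substitute transformers.__version__ for the literal strings below):
print(version_at_least("4.38.1", (4, 38)))  # True  -> new enough for Gemma
print(version_at_least("4.37.2", (4, 38)))  # False -> upgrade needed
```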

osanseviero changed discussion status to closed

Name: transformers
Version: 4.38.1
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: C:\Users\Administrator\AppData\Local\NVIDIA\MiniConda\envs\gcn\Lib\site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm

I have already installed the specified version (4.38.1, as shown above), but it still doesn't work.

Why doesn't it work no matter how I update? Is anyone else seeing the same issue?

Google org

Make sure to run pip install -U transformers and then restart your environment. If you're in Colab, the session will keep using the older version until you restart the runtime.
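The reason the restart matters can be sketched with the standard library's `json` module standing in for transformers (a hypothetical stand-in, chosen only so the example is self-contained): Python caches imported modules in `sys.modules`, so a package upgraded with pip mid-session is not picked up by the running interpreter.

```python
import importlib
import json
import sys

# Once imported, the module object is cached in sys.modules:
print("json" in sys.modules)  # True

# Re-importing returns the cached object, not a fresh copy read from disk,
# which is why an in-place pip upgrade does not take effect mid-session:
json_again = importlib.import_module("json")
print(json_again is json)  # True

# importlib.reload forces a re-read of one module, but it does not refresh
# already-imported submodules, so restarting the kernel/runtime is the
# reliable fix after upgrading a large package like transformers.
```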
