ImportError when trying to quantize and run any of the Gemma models on my local machine

#63
by Prajwalll - opened

I'm getting the following error when trying to get the Gemma model working on my local machine:

"ImportError: Using bitsandbytes 8-bit quantization requires Accelerate: pip install accelerate and the latest version of bitsandbytes: pip install -i https://pypi.org/simple/ bitsandbytes"

It says that the Accelerate package is missing, but I can see that Accelerate, bitsandbytes, and Transformers are all present on my machine when I run pip list. The transformers package is also the latest version, transformers==4.38.1.
I will attach the code and the output from the pip list command below.

[screenshots: code and pip list output]
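For reference, the loading code looks roughly like the sketch below (this is not the exact script from the screenshots; the model id, prompt, and generation call are assumptions), using 8-bit quantization via BitsAndBytesConfig:

```python
# Minimal sketch of the loading code that hits the Accelerate/bitsandbytes check.
# The model id, prompt, and generation call are assumptions, not the exact chat.py.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "google/gemma-7b-it"  # assumed Gemma checkpoint

bnb_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # requires accelerate
)

inputs = tokenizer("Hello, Gemma!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```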

Hi @Prajwalll
Hmm, that's weird; there might be some conflicts in your env. Can you try in a fresh new environment?
Also, do you have access to a GPU?

Hey @ybelkada, I just tried with a fresh environment and installed only the following latest libraries: `transformers==4.38.1, accelerate==0.27.2, bitsandbytes==0.42.0`, but I'm still getting the same error. I will also attach the output of the pip list command below:

[screenshot: pip list output]

I currently have an RTX 2060 in my machine.

Hmm, interesting. Can you try to import bitsandbytes in a Python console and check its output? I suspect your bnb installation might be corrupted. Can you also try installing bitsandbytes from source, following the installation guidelines for Windows machines: https://huggingface.co/docs/bitsandbytes/main/en/installation?OS+system=Windows ?
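For example, something along these lines in a plain Python console should surface any CUDA setup problems (just a minimal sketch):

```python
# A broken bitsandbytes install typically prints CUDA setup warnings/errors
# or raises at import time.
import bitsandbytes as bnb
print(bnb.__version__)
```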

I tried running the scripts below on my laptop, which has an RTX 3060, but I'm still getting the same error.

Requirements file:

[screenshot: requirements file]

pip list:

[screenshot: pip list output]

chat.py file:

[screenshot: chat.py]

Output when I run the chat.py file:

[screenshot: chat.py output]

Do you think there's something wrong with my Hugging Face account or the access token associated with it?
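In case it helps to rule out token issues, a check along these lines (a minimal sketch, assuming huggingface_hub is installed and the token was saved via huggingface-cli login) should print your account info if the token is valid:

```python
# Prints your account info if the saved access token is valid; raises otherwise.
from huggingface_hub import HfApi
print(HfApi().whoami())
```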

Hi @Prajwalll
I think your bnb installation is broken and did not properly detect your GPU, so bnb gets imported but is not used by transformers because it couldn't detect your GPU device.
Can you try to follow the instructions I shared above for installing bnb from source on Windows, and make sure it does not raise an error when you simply import bitsandbytes in a Python console?
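As a quick sanity check, something like this should confirm whether PyTorch and bnb both see the GPU (a minimal sketch, assuming a CUDA build of PyTorch is installed):

```python
# If this prints False, the PyTorch/CUDA setup is the problem rather than transformers.
import torch
print(torch.cuda.is_available())

# bitsandbytes should import cleanly, with no CUDA setup errors in the output.
import bitsandbytes as bnb
print(bnb.__version__)
```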
