
Transformers package version?

#1
by sonia-rao - opened

I'm not able to load the model; I get this error: ValueError: The checkpoint you are trying to load has model type colpali but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
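When this error comes from the installed Transformers release simply predating the architecture, a quick version comparison narrows it down before digging through individual packages. A minimal, stdlib-only sketch; the minimum version used below is a placeholder, since at the time of this thread `colpali` support was not yet in a stable release:

```python
# Compare the installed transformers version against a (hypothetical)
# minimum release that ships the `colpali` architecture.
def parse_version(v: str) -> tuple:
    """Turn a version string like '4.47.1' into (4, 47, 1) for comparison."""
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())

def supports_model_type(installed: str, minimum: str) -> bool:
    """True if the installed version is at least the assumed minimum."""
    return parse_version(installed) >= parse_version(minimum)

# The environment in this thread had transformers 4.47.1, which did not
# recognize `colpali`, so any sufficient stable release must be newer.
print(supports_model_type("4.47.1", "4.48.0"))  # -> False
```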

Below are the package versions I have in my venv. Does anyone know which one is wrong?

Package                  Version

accelerate 1.2.1
aiohappyeyeballs 2.4.4
aiohttp 3.11.11
aiosignal 1.3.2
async-timeout 5.0.1
attrs 24.3.0
certifi 2024.12.14
charset-normalizer 3.4.0
datasets 3.2.0
dill 0.3.8
einops 0.8.0
filelock 3.16.1
frozenlist 1.5.0
fsspec 2024.10.0
huggingface 0.0.1
huggingface-hub 0.27.0
idna 3.10
jinja2 3.1.4
markupsafe 3.0.2
mpmath 1.3.0
multidict 6.1.0
multiprocess 0.70.16
networkx 3.4.2
numpy 2.2.0
nvidia-cublas-cu12 12.1.3.1
nvidia-cuda-cupti-cu12 12.1.105
nvidia-cuda-nvrtc-cu12 12.1.105
nvidia-cuda-runtime-cu12 12.1.105
nvidia-cudnn-cu12 8.9.2.26
nvidia-cufft-cu12 11.0.2.54
nvidia-curand-cu12 10.3.2.106
nvidia-cusolver-cu12 11.4.5.107
nvidia-cusparse-cu12 12.1.0.106
nvidia-nccl-cu12 2.19.3
nvidia-nvjitlink-cu12 12.6.85
nvidia-nvtx-cu12 12.1.105
packaging 24.2
pandas 2.2.3
peft 0.10.0
pillow 11.0.0
propcache 0.2.1
psutil 6.1.0
pyarrow 18.1.0
python-dateutil 2.9.0.post0
pytz 2024.2
pyyaml 6.0.2
regex 2024.11.6
requests 2.32.3
safetensors 0.4.5
setuptools 75.6.0
six 1.17.0
sympy 1.13.3
tokenizers 0.21.0
torch 2.2.0
tqdm 4.67.1
transformers 4.47.1
triton 2.2.0
typing-extensions 4.12.2
tzdata 2024.2
urllib3 2.2.3
xxhash 3.5.0
yarl 1.18.3

Never mind, fixed it by installing transformers from source.
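For anyone landing here with the same error, either upgrading to the newest stable release (once it includes the architecture) or installing from source, as done above, should work. A sketch of both options:

```shell
# Option 1: upgrade to the latest stable release
pip install --upgrade transformers

# Option 2: install from source (what resolved it in this thread),
# which pulls the current main branch of the repository
pip install git+https://github.com/huggingface/transformers.git
```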

sonia-rao changed discussion status to closed
