Error when trying to load into RAM

#1
by AS1200 - opened

I get the error "AttributeError: 'LlamacppModel' object has no attribute 'model'".

This error occurs with both the llama.cpp and ctransformers loaders.

The model description links to the latest version of llama-cpp-python, but I don't understand how to install it in the oobabooga interface. Please help me; I really want to test Qwen on my local computer.

I'm a noob and hope to get clear instructions.

Working on that.
Building new wheels; in the meantime you can manually install the newest version from https://github.com/CausalLM/llama-cpp-python.
With cuBLAS, for example: CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install git+https://github.com/CausalLM/llama-cpp-python
Sorry, it is better to wait for the wheel build, since the package names differ.
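For anyone attempting the manual route anyway, here is a rough, untested sketch of the steps inside the text-generation-webui environment. It assumes the one-click installer's environment scripts (cmd_linux.sh / cmd_windows.bat), that the bundled wheel is installed under a name like llama-cpp-python or llama-cpp-python-cuda (names may differ, which is exactly the issue above), and POSIX-shell syntax for the CMAKE_ARGS prefix:

# Open the webui's bundled Python environment first (one-click installer):
#   ./cmd_linux.sh      on Linux
#   cmd_windows.bat     on Windows (set CMAKE_ARGS separately there)
# Remove the wheel that ships with the webui so the fork replaces it cleanly;
# the package names below are assumptions, adjust to what "pip list" shows
pip uninstall -y llama-cpp-python llama-cpp-python-cuda
# Build and install the fork with cuBLAS enabled (drop CMAKE_ARGS for CPU-only)
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install --force-reinstall --no-cache-dir git+https://github.com/CausalLM/llama-cpp-python

Restart the webui afterwards so the loader picks up the new build.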

Updated, please follow the new README.

JosephusCheung changed discussion status to closed
