How to load a safetensors model?

by Cetafe - opened

Hi, I am new to DL and just curious how to load the safetensors model. I used "model = LlamaForCausalLM.from_pretrained('E:/vicuna_13b_v1.1_GPTQ_4bit128g_cuda/').cuda()", just like with the native Vicuna weights, and it gives me an error: FileNotFoundError: [Errno 2] No such file or directory: 'E:/vicuna_13b_v1.1_GPTQ_4bit128g_cuda/pytorch_model-00001-of-00003.bin'. Do I need to modify the config? Thanks a lot!
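For context, a plain (non-quantized) checkpoint that ships *.safetensors files loads fine with from_pretrained; passing use_safetensors=True just stops transformers from falling back to pytorch_model-*.bin shards, which is what triggers the FileNotFoundError above. A minimal sketch, assuming a standard (non-GPTQ) Vicuna checkpoint at a hypothetical local path — the 4-bit GPTQ weights in this folder need a GPTQ-aware loader instead, as the reply below explains:

```python
from transformers import LlamaForCausalLM

# Hypothetical path to a *standard* Vicuna checkpoint that ships
# model*.safetensors files (not the 4-bit GPTQ folder from this thread).
checkpoint_dir = "E:/vicuna_13b_v1.1/"

# use_safetensors=True loads the .safetensors weights and refuses to
# fall back to pytorch_model-*.bin shards.
model = LlamaForCausalLM.from_pretrained(checkpoint_dir, use_safetensors=True).cuda()
```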

I was able to use this after installing GPTQ-for-LLaMa, by following the instructions on the oobabooga text-generation-webui wiki page: https://github.com/oobabooga/text-generation-webui/wiki/GPTQ-models-(4-bit-mode)
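If you want to load the 4-bit file directly from Python rather than through the webui, here is a rough sketch using GPTQ-for-LLaMa's load_quant helper. The clone location, import path, exact signature, and checkpoint filename below are all assumptions based on that repo's inference script and can differ between forks and commits, so treat this as a guide rather than copy-paste code:

```python
import sys

# Assumes GPTQ-for-LLaMa was cloned as described on the webui wiki page above;
# adjust this to wherever the repo actually lives (hypothetical path).
sys.path.insert(0, "E:/GPTQ-for-LLaMa")

# load_quant comes from GPTQ-for-LLaMa's inference script; the module name
# may differ depending on the fork.
from llama_inference import load_quant

model_dir = "E:/vicuna_13b_v1.1_GPTQ_4bit128g_cuda/"  # folder with config.json and tokenizer
checkpoint = model_dir + "model.safetensors"          # placeholder: use the actual .safetensors file name

# 4-bit weights with group size 128, matching the "4bit128g" in the folder name.
model = load_quant(model_dir, checkpoint, 4, 128)
model = model.cuda()
```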
