how to load this model
#4
by balu548411 - opened
Hey Bloke, can you help me with this please?
I used:
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("TheBloke/Vicuna-33B-1-3-SuperHOT-8K-GPTQ")
model = AutoModelForCausalLM.from_pretrained("TheBloke/Vicuna-33B-1-3-SuperHOT-8K-GPTQ", trust_remote_code=True)
I got:
OSError: TheBloke/Vicuna-33B-1-3-SuperHOT-8K-GPTQ does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
Please see the README file for instructions on how to load it from Python code.
Sorry, and thank you, Bloke.
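
For reference, the error means this repo only ships quantized GPTQ weights (.safetensors), not pytorch_model.bin, so plain AutoModelForCausalLM cannot load it this way. Below is a minimal sketch of loading it with the auto-gptq package instead, assuming auto-gptq is installed and a CUDA GPU is available; the exact quantization settings (and whether a model_basename argument is needed) should be checked against the repo's README.

from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_name_or_path = "TheBloke/Vicuna-33B-1-3-SuperHOT-8K-GPTQ"

# The tokenizer loads the same way as in the original attempt.
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)

# from_quantized reads the GPTQ .safetensors weights directly.
# trust_remote_code=True is kept from the original attempt (SuperHOT 8K context);
# a model_basename argument may also be required -- see the repo README.
model = AutoGPTQForCausalLM.from_quantized(
    model_name_or_path,
    use_safetensors=True,
    trust_remote_code=True,
    device="cuda:0",
)

# Quick smoke test with an arbitrary prompt (not the model's chat template).
prompt = "Hello, who are you?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0]))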
balu548411 changed discussion status to closed