16 bit model release?

#3
by Kernel - opened

Can you please convert this to fp16?

When you download the model, you can specify the dtype by doing:

import torch
from transformers import AutoModelForCausalLM

# torch_dtype=torch.half loads the weights directly in fp16
model = AutoModelForCausalLM.from_pretrained("nomic-ai/gpt4all-13b-snoozy", torch_dtype=torch.half)
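If you want an actual fp16 checkpoint on disk rather than just loading in half precision, you can save the loaded model back out with save_pretrained. A minimal sketch (the output directory name here is just an example):

# write the half-precision weights to a local directory as an fp16 checkpoint
model.save_pretrained("gpt4all-13b-snoozy-fp16")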
zpn changed discussion status to closed
