Why is the model file 16 GB?
#3
by
deewalia20
- opened
It is a fine-tuned version of phi-3 mini, which has only 3.8B parameters. Please check the model file.
Hi, yes, the model is a fine-tuned version of phi-3-mini. We performed fine-tuning in 32-bit precision, but you can also load the model in 16-bit mode:

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("numind/NuExtract", torch_dtype=torch.bfloat16, trust_remote_code=True)
In fact, we recommend you do this as it leads to negligible performance loss :)
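For reference, the 16 GB file size follows directly from the parameter count and precision mentioned above. A rough back-of-the-envelope sketch (the ~3.8B figure comes from the posts above; real checkpoints add a small amount of metadata overhead):

```python
# Estimate checkpoint size from parameter count and bytes per parameter.
params = 3.8e9  # ~3.8B parameters, per the thread above

fp32_gb = params * 4 / 1e9  # 4 bytes per parameter at 32-bit precision
bf16_gb = params * 2 / 1e9  # 2 bytes per parameter at bfloat16

print(f"fp32: ~{fp32_gb:.1f} GB")  # ~15.2 GB, matching the ~16 GB file
print(f"bf16: ~{bf16_gb:.1f} GB")  # ~7.6 GB when loaded in 16-bit mode
```

So the file is large because the weights were saved in fp32; loading in bfloat16 halves the memory footprint.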
Thanks, that makes sense now.
deewalia20
changed discussion status to closed