Is it fine to do granite-20b model's inference with bfloat16 dtype?

#1
by lkm1 - opened

I have tried running this model's inference in the bfloat16 dtype and have not faced any problems. Is it fine to use bfloat16 for this model's inference without much effect on the scores?
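
For reference, a minimal sketch of the kind of bf16 inference setup being asked about, using the standard transformers API. The checkpoint id (`ibm-granite/granite-20b-code-base`) and the prompt are assumptions for illustration, not taken from this thread:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint id; substitute the exact granite-20b repo you are using.
model_id = "ibm-granite/granite-20b-code-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # load and run inference in bf16
    device_map="auto",           # requires the accelerate package
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```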

IBM Granite org

yeah, it should be ok @lkm1

IBM Granite org

we have run the evaluation scores in bf16

ok, thanks @mayank-mishra!

lkm1 changed discussion status to closed
