
GPTQ-quantized falcon-rw-1b

| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| main | 8 | -- | No | | | 2048 | | | 8-bit, without Act Order and no group size. |
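As a minimal sketch of how the settings in the table map onto a GPTQ quantization run, the snippet below uses the AutoGPTQ library against the upstream `tiiuae/falcon-rw-1b` weights. The calibration text, output directory, and damp value are illustrative assumptions, not recorded settings from this repository.

```python
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
from transformers import AutoTokenizer

base_model = "tiiuae/falcon-rw-1b"  # upstream FP16 model being quantized

quantize_config = BaseQuantizeConfig(
    bits=8,          # 8-bit weights, per the table above
    group_size=-1,   # no group size
    desc_act=False,  # Act Order disabled
)

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoGPTQForCausalLM.from_pretrained(base_model, quantize_config)

# Placeholder calibration sample; a real run would use a calibration dataset
# with sequences up to 2048 tokens (the Seq Len column above).
examples = [tokenizer("GPTQ calibrates on a small set of sample texts.")]

model.quantize(examples)
model.save_quantized("falcon-rw-1b-gptq", use_safetensors=True)
```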
Safetensors model size: 407M params. Tensor types: I32, FP16.
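A minimal loading sketch, again assuming the AutoGPTQ library; the repository id below is a placeholder to replace with this model's actual Hub path.

```python
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

repo_id = "<this-repo>"  # placeholder: Hub id of this quantized checkpoint

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoGPTQForCausalLM.from_quantized(
    repo_id,
    device="cuda:0",
    use_safetensors=True,  # checkpoint ships safetensors with I32 and FP16 tensors
)

prompt = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**prompt, max_new_tokens=32)[0]))
```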