Qwen2.5-0.5B

An ExLlamaV2 8 bits-per-weight (bpw) EXL2 quantization of https://huggingface.co/Qwen/Qwen2.5-0.5B
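
To run the quant locally, something like the sketch below should work with the `exllamav2` Python package. This is a minimal example, not an official recipe: the local directory name, prompt, and sampler settings are placeholders, and the exact API can differ slightly between exllamav2 releases.

```python
# Minimal sketch: loading this EXL2 quant with the exllamav2 Python API.
# Assumes exllamav2 is installed and the quantized weights were downloaded first, e.g.:
#   huggingface-cli download altomek/Qwen2.5-0.5B-8bpw-EXL2 --local-dir Qwen2.5-0.5B-8bpw-EXL2

from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "Qwen2.5-0.5B-8bpw-EXL2"   # placeholder: local path to the downloaded quant

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)             # load weights, splitting across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

print(generator.generate_simple("The capital of France is", settings, num_tokens=64))
```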


Model tree for altomek/Qwen2.5-0.5B-8bpw-EXL2

Base model: Qwen/Qwen2.5-0.5B → quantized → this model