| Model | Wiki (ppl ↓) | C4 (ppl ↓) | PIQA | ARC-E | ARC-C | HellaSwag | Wino | Avg. |
|---|---|---|---|---|---|---|---|---|
| Unquantized | 6.1 | 9.2 | 79.9 | 80.1 | 50.4 | 60.2 | 72.8 | 68.6 |
| W4G64 | 6.11 | 9.38 | 79.33 | 79.79 | 49.74 | 59.22 | 73.95 | 68.41 |
| W3G64 | 7.13 | 11.06 | 78.78 | 76.22 | 44.37 | 56.69 | 70.32 | 65.28 |

Wiki and C4 report perplexity (lower is better); the remaining columns report zero-shot accuracy (%), with Avg. the mean of those five. WxGy denotes x-bit weights with quantization group size y.

Revisions available in this repository (a download sketch follows the list):

- `main` (W4G64, learned scales)
- `nfl_w3g64` (W3G64, learned scales)
- `nf_w4g64` (W4G64, scales not learned)
- `nf_w3g64` (W3G64, scales not learned)
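
Each revision is a branch of the Hugging Face repository. Below is a minimal sketch (not taken from the model card) for fetching one of them with the standard `huggingface_hub` API; actually running the FLUTE-packed weights additionally requires the FLUTE kernel, which is outside the scope of this sketch.

```python
# Minimal sketch: download a specific quantized revision (branch) of the repo.
# `snapshot_download` and its `revision` argument are standard huggingface_hub API.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="radi-cho/Meta-Llama-3-8B-FLUTE",
    revision="nf_w3g64",  # any branch name from the list above
)
print(local_dir)  # path to the downloaded checkpoint files
```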

The evaluations above are reported for the learned-scale revisions. Zero-shot benchmark scores are computed with lm-evaluation-harness; an illustrative invocation is sketched below.
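
The following is a hedged sketch of how such zero-shot scores could be gathered with the lm-evaluation-harness Python API (`lm_eval.simple_evaluate`, v0.4-style). The task names and evaluation settings are assumptions inferred from the column names; the model card only states that the harness was used, and evaluating the quantized revisions requires the FLUTE integration to load them.

```python
# Hedged sketch: zero-shot evaluation with lm-evaluation-harness (v0.4-style API).
# Task names and settings are assumptions; the card only names the harness.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=radi-cho/Meta-Llama-3-8B-FLUTE,revision=main",
    tasks=["piqa", "arc_easy", "arc_challenge", "hellaswag", "winogrande"],
    num_fewshot=0,
    batch_size=8,
)
for task, metrics in results["results"].items():
    print(task, metrics)
```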
