---
license: gemma
---

**Warning:** gemma-2-27b models do not run well in float16 precision, so this FLUTE-quantized model is released in bfloat16.
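As context for the warning above, here is a minimal sketch of why float16 can fail for this model: float16 has only 5 exponent bits, so large activation values overflow to infinity, while bfloat16 keeps float32's 8 exponent bits and the same values stay finite. The `to_bfloat16` helper below is an illustrative mantissa-truncation emulation (numpy has no native bfloat16), not the actual cast used by any library.

```python
import numpy as np

# float16's largest finite value is small because of its 5 exponent bits:
fp16_max = np.finfo(np.float16).max   # 65504.0

# A large activation value overflows to infinity in float16...
big = np.float32(70000.0)
print(np.float16(big))                # inf

# ...but bfloat16 keeps float32's 8 exponent bits, so its range reaches
# ~3.4e38. Emulate bfloat16 by truncating the float32 mantissa to 7 bits:
def to_bfloat16(x):
    bits = np.float32(x).view(np.uint32) & np.uint32(0xFFFF0000)
    return bits.view(np.float32)

print(to_bfloat16(big))               # 69632.0, finite (reduced precision)
```

The overflow, not the reduced mantissa precision, is what typically breaks inference: once an activation becomes `inf`, subsequent layers propagate `nan`.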

| Quantization | Wiki | C4   |
|--------------|------|------|
| W4G64        | 5.91 | 9.71 |
| W3G64        | TBD  | TBD  |

Evaluations are provided for models with learned scales. See the base gemma-2-27b-FLUTE model for lm-eval-harness benchmarks.