This repo contains several GGUF quants of Hestia-20b.

This is a task_arithmetic merge of Harmonia (my 20b faux base model) with Noromaid and my LoRA-glued Nethena. It solidly outperforms Harmonia.

```yaml
merge_method: task_arithmetic
base_model: athirdpath/Harmonia-20b
models:
  - model: athirdpath/Harmonia-20b
  - model: NeverSleep/Noromaid-20b-v0.1.1
    parameters:
      weight: 0.25
  - model: athirdpath/Nethena-20b-Glued
    parameters:
      weight: 0.2
dtype: float16
```
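For reference, task arithmetic merging works per tensor: each fine-tune's delta from the base model is scaled by its weight and added back onto the base. Below is a minimal illustrative sketch of that idea (not mergekit's actual implementation; the state-dict variable names at the bottom are hypothetical), using the weights from this card's config:

```python
import torch

def task_arithmetic_merge(base: dict, finetunes: list[tuple[dict, float]]) -> dict:
    """Per tensor: merged = base + sum_i w_i * (finetune_i - base)."""
    merged = {}
    for name, base_t in base.items():
        delta = torch.zeros_like(base_t)
        for weights, w in finetunes:
            delta += w * (weights[name] - base_t)
        merged[name] = base_t + delta
    return merged

# With this config: Noromaid at 0.25 and Nethena-Glued at 0.2,
# both expressed as deltas from Harmonia-20b (hypothetical state dicts).
# merged_sd = task_arithmetic_merge(harmonia_sd, [(noromaid_sd, 0.25), (nethena_sd, 0.2)])
```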

Thanks to Undi95 for pioneering the 20B recipe, and for most of the models involved.

Quantizations provided: 4-bit, 5-bit, and 8-bit GGUF (llama architecture, 20B parameters).
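To try one of the quants locally, llama-cpp-python is one option; the filename below is hypothetical, so substitute the actual GGUF file from this repo:

```python
from llama_cpp import Llama

# Hypothetical filename; pick the 4-, 5-, or 8-bit GGUF file you downloaded.
llm = Llama(model_path="hestia-20b.Q5_K_M.gguf", n_ctx=4096)

out = llm("Once upon a time,", max_tokens=128)
print(out["choices"][0]["text"])
```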
