mixtral megamerge 8x7b v1

The following models were merged with DARE (drop-and-rescale) using https://github.com/martyn/safetensors-merge-supermario:

Mergelist

cognitivecomputations/dolphin-2.6-mixtral-8x7b
mistralai/Mixtral-8x7B-v0.1
mistralai/Mixtral-8x7B-Instruct-v0.1

Merge command

python hf_merge.py mergelist.txt mixtral-merge-1 -p 0.1 -lambda 1.95
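The command above applies DARE with drop probability `p = 0.1` and a scaling factor `lambda = 1.95`. A minimal sketch of the standard drop-and-rescale update for a single tensor is below; this is an illustration of the technique, not the repository's actual implementation, and the function name `dare_merge` is hypothetical.

```python
import numpy as np

def dare_merge(base, finetuned, p=0.1, lam=1.95, seed=None):
    """Sketch of DARE for one weight tensor.

    The delta between fine-tuned and base weights is randomly dropped
    elementwise with probability p, the survivors are rescaled by
    1 / (1 - p), and the result is added back to the base weights
    scaled by lambda.
    """
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    # keep each delta element with probability (1 - p)
    mask = rng.random(delta.shape) >= p
    # rescale survivors so the expected delta is preserved
    return base + lam * (delta * mask) / (1.0 - p)
```

With `p = 0` and `lam = 1` this reduces to returning the fine-tuned weights unchanged; the merge script applies the same idea across every tensor of each model in the mergelist.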

Notes

  • Seems to generalize across instruct styles
  • MoE gate tensors are left unmodified
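Leaving the MoE gates alone means the router tensors are copied through rather than merged. A hedged sketch of such a filter, assuming Mixtral-style parameter names (router gates live under `block_sparse_moe.gate`; the helper name `is_moe_gate` is hypothetical):

```python
def is_moe_gate(name: str) -> bool:
    """Return True for Mixtral MoE router/gate tensors.

    Assumes Hugging Face Mixtral naming, e.g.
    "model.layers.0.block_sparse_moe.gate.weight".
    Tensors matching this are skipped during the merge.
    """
    return "block_sparse_moe.gate" in name
```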