---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- dare
- super mario merge
- pytorch
- mixtral
- merge
---
# mixtral megamerge 8x7b v1
The following models were merged with DARE using [martyn/safetensors-merge-supermario](https://github.com/martyn/safetensors-merge-supermario).
## Mergelist
```
cognitivecomputations/dolphin-2.6-mixtral-8x7b
mistralai/Mixtral-8x7B-v0.1
mistralai/Mixtral-8x7B-Instruct-v0.1
```
## Merge command
```
python hf_merge.py mergelist.txt mixtral-merge-1 -p 0.1 -lambda 1.95
```
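For reference, here is a minimal sketch of the per-tensor DARE update the command above performs, assuming `-p` is the drop probability applied to the delta weights and `-lambda` scales the merged delta; the repo's actual implementation may differ in details.

```python
import torch

def dare_merge_tensor(base: torch.Tensor, finetuned: torch.Tensor,
                      p: float = 0.1, lam: float = 1.95) -> torch.Tensor:
    """DARE: randomly drop a fraction p of the fine-tuned delta,
    rescale the survivors by 1/(1-p), then add back onto the base."""
    delta = finetuned - base
    # Keep each delta element with probability (1 - p)
    mask = torch.bernoulli(torch.full_like(delta, 1.0 - p))
    rescaled = (delta * mask) / (1.0 - p)
    return base + lam * rescaled
```

Applied independently to each tensor of each fine-tuned checkpoint against the shared base (`mistralai/Mixtral-8x7B-v0.1`), with the results accumulated into the output model.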
### Notes
* Seems to generalize across the instruct styles of the merged models
* MoE gate weights are left unmodified by the merge (see the sketch below)
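Skipping the MoE gates can be expressed as a simple name filter over the checkpoint's tensors; the pattern below uses Mixtral's Hugging Face tensor naming (e.g. `model.layers.0.block_sparse_moe.gate.weight`), and `should_merge` is a hypothetical helper, not taken from the merge script.

```python
def should_merge(tensor_name: str) -> bool:
    # Leave Mixtral's MoE router/gate tensors untouched;
    # everything else is eligible for the DARE merge.
    return ".block_sparse_moe.gate." not in tensor_name
```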