---
license: apache-2.0
tags:
- merge
---
SLERP merge of mindy-labs/mindy-7b-v2 with jondurbin/bagel-dpo-7b-v0.1. The result was then SLERP-merged with rishiraj/CatPPT.

Heard some talk of jondurbin/bagel-dpo-7b-v0.1 in the community, and it sounds interesting, so it was merged with two high-performing models to produce cookinai/Valkyrie-V1.
Slerp 1:
```yaml
slices:
  - sources:
      - model: jondurbin/bagel-dpo-7b-v0.1
        layer_range: [0, 32]
      - model: mindy-labs/mindy-7b-v2
        layer_range: [0, 32]
merge_method: slerp
base_model: mindy-labs/mindy-7b-v2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
Slerp 2:
```yaml
slices:
  - sources:
      - model: previous/model/path
        layer_range: [0, 32]
      - model: rishiraj/CatPPT
        layer_range: [0, 32]
merge_method: slerp
base_model: previous/model/path
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
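For intuition, here is a minimal sketch of the spherical linear interpolation (SLERP) that the configs above apply per tensor, with `t` controlling the blend between the two models' weights. This is an illustrative NumPy toy, not mergekit's actual implementation; the function name and the fallback-to-linear behavior for near-parallel tensors are assumptions.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherically interpolate between two weight tensors a and b.

    t = 0 returns a, t = 1 returns b; in between, the result follows
    the arc between the two (normalized) tensors rather than the chord.
    Illustrative only -- real merge tools handle many edge cases.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_n = a_flat / (np.linalg.norm(a_flat) + eps)
    b_n = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    s = np.sin(theta)
    out = (np.sin((1 - t) * theta) / s) * a_flat + (np.sin(t * theta) / s) * b_flat
    return out.reshape(a.shape)
```

In the configs, the `value` lists give `t` at evenly spaced layer checkpoints (interpolated for layers in between), so, e.g., self-attention tensors lean toward the base model in early layers and toward the other model in late layers, while MLP tensors do the opposite.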