---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# Evolutionary model merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with NeuralBeagle14-7B as the base model.
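
Task arithmetic computes each fine-tuned model's "task vector" (its weight delta from the base) and adds a weighted sum of those vectors back onto the base: `merged = base + sum_i w_i * (model_i - base)`. A negative weight, as used for Starling in the early layers of the config below, subtracts that model's task vector. A minimal NumPy sketch of the formula on a single tensor (the function name and toy tensors are illustrative, not mergekit's internals):

```python
import numpy as np

def task_arithmetic(base, models, weights):
    """Return base + sum_i weights[i] * (models[i] - base)."""
    merged = base.astype(np.float64).copy()
    for model, weight in zip(models, weights):
        merged += weight * (model - base)  # weighted task vector
    return merged

# Toy 2x2 tensors standing in for one layer's weight matrix.
base = np.array([[1.0, 0.0], [0.0, 1.0]])
model_a = np.array([[1.2, 0.1], [0.0, 0.9]])  # hypothetical fine-tune A
model_b = np.array([[0.8, 0.0], [0.2, 1.1]])  # hypothetical fine-tune B
merged = task_arithmetic(base, [model_a, model_b], [0.5, -0.25])
print(merged)
```

In the actual merge, a separate weight triple was evolved for each 8-layer slice, so the mix of the three source models varies across the depth of the network.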

### Models Merged

The following models were included in the merge:
* Starling-LM-7B-beta_581094980
* Mistral-7B-v0.1-flashback-v2-instruct_3664132380
* NeuralBeagle14-7B_2368216670

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
    parameters:
      weight: 0.6116678110210994
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      weight: -0.24959657782037278
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2-instruct_3664132380
    parameters:
      weight: 0.540324494683666
- sources:
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
    parameters:
      weight: 0.3293682339424332
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      weight: -0.023694567670847724
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2-instruct_3664132380
    parameters:
      weight: -0.1930115458123503
- sources:
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
    parameters:
      weight: 0.27340593188424295
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      weight: 0.08277665681111157
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2-instruct_3664132380
    parameters:
      weight: -0.04650853736971121
- sources:
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
    parameters:
      weight: 0.22175238436196998
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      weight: 0.3692597806977656
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2-instruct_3664132380
    parameters:
      weight: 0.5617035813353589
```
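
To reproduce the merge, the configuration above can be saved to a file and run with mergekit. A sketch of the Python route, following the usage pattern in mergekit's README (the output path and config filename are placeholders, and the `/content/evol_merge_storage/...` input paths in the config must exist locally):

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (assumed saved as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(copy_tokenizer=True),
)
```

The same merge can be run from the command line with `mergekit-yaml config.yaml ./merged-model`.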