Files changed (1)
  1. README.md +3 -55
README.md CHANGED
@@ -1,68 +1,16 @@
  ---
- base_model:
- - NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
- - NeuralNovel/Gecko-7B-v0.1-DPO
- tags:
- - mergekit
- - merge
  license: apache-2.0

  datasets:
- - Intel/orca_dpo_pairs
- - NeuralNovel/Neural-Story-v1
-
  ---


  ![tiger](https://cdn-uploads.huggingface.co/production/uploads/645cfe4603fc86c46b3e46d1/a9GqRTNoGZQsRVU-C6XRO.jpeg)


- # Tiger-7b-v0.1
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- [Join our Discord!](https://discord.gg/Qge8Ds9C)
-
- ## Metrics
-
- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/645cfe4603fc86c46b3e46d1/Z58bB5sYr3pyE2Ilbk7Dk.png)
-
- ### Merge Method
-
- This model was merged using the SLERP (spherical linear interpolation) merge method.
-
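
SLERP blends the two parent models along the arc between their weight tensors rather than along a straight line, which tends to preserve the magnitude of the weights better than plain linear averaging. Below is a minimal, illustrative PyTorch sketch of the idea; it is not mergekit's implementation, and the shapes and variable names are placeholders.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation: t=0 returns `a`, t=1 returns `b`."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    omega = torch.arccos(torch.clamp(a_dir @ b_dir, -1.0, 1.0))  # angle between the two tensors
    if omega.abs() < eps:  # nearly parallel: fall back to plain linear interpolation
        return (1 - t) * a + t * b
    sin_omega = torch.sin(omega)
    mixed = (torch.sin((1 - t) * omega) * a_flat + torch.sin(t * omega) * b_flat) / sin_omega
    return mixed.reshape(a.shape).to(a.dtype)

# Example: blend one projection matrix at t=0.5. In the configuration further
# down, t varies per layer and per module via the `filter` entries.
w_story = torch.randn(4096, 4096)  # stand-in for a Neural-Story weight tensor
w_gecko = torch.randn(4096, 4096)  # stand-in for the matching Gecko-7B-v0.1-DPO tensor
w_merged = slerp(0.5, w_story, w_gecko)
```
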
- ### Models Merged
-
- The following models were included in the merge:
- * [NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story](https://huggingface.co/NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story)
- * [NeuralNovel/Gecko-7B-v0.1-DPO](https://huggingface.co/NeuralNovel/Gecko-7B-v0.1-DPO)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
-   - sources:
-       - model: NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
-         layer_range: [0, 32]
-       - model: NeuralNovel/Gecko-7B-v0.1-DPO
-         layer_range: [0, 32]
- merge_method: slerp
- base_model: NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
- parameters:
-   t:
-     - filter: self_attn
-       value: [0, 0.5, 0.3, 0.7, 1]
-     - filter: mlp
-       value: [1, 0.5, 0.7, 0.3, 0]
-     - value: 0.5
- dtype: bfloat16
- ```
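
To reproduce a merge from a configuration like this, the YAML file is passed to mergekit's command-line entry point (`mergekit-yaml path/to/config.yml ./output-directory`), and the resulting folder loads like any other Hugging Face checkpoint. A minimal loading sketch follows; the repo id is an assumption (substitute the actual id or a local mergekit output path), and the prompt format simply follows the Mistral-Instruct convention of the base model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the merged model is published under this repo id. Replace it with
# the real Hugging Face id or the local directory produced by mergekit-yaml.
model_id = "NeuralNovel/Tiger-7b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype declared in the merge config
    device_map="auto",
)

# Mistral-Instruct-style prompt (assumed appropriate for this Mistral-based merge).
prompt = "[INST] Write a two-sentence story about a tiger. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```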
 
  ---
+
  license: apache-2.0

  datasets:
+ - NeuralNovel/Neural-DPO
  ---


  ![tiger](https://cdn-uploads.huggingface.co/production/uploads/645cfe4603fc86c46b3e46d1/a9GqRTNoGZQsRVU-C6XRO.jpeg)


+ # Tiger-7b-v0.1-Laser-DPO