icefog72 committed
Commit 6838e68
1 Parent(s): b3f9a2d

Update README.md

Files changed (1)
  1. README.md +54 -50
README.md CHANGED
@@ -1,50 +1,54 @@
- ---
- base_model: []
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # IceSakeV6RP-7b
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with G:\FModels\Mistroll-7B-v2.2 as the base.
-
- ### Models Merged
-
- The following models were included in the merge:
- * F:\FModels\IceCoffeeRP-7b
- * H:\FModels\kuno-royale-v3-7b
- * G:\FModels\IceSakeV4RP-7b
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- models:
-   - model: H:\FModels\kuno-royale-v3-7b
-     parameters:
-       density: 0.8
-       weight: 0.5
-   - model: G:\FModels\IceSakeV4RP-7b
-     parameters:
-       density: 0.5
-       weight: 0.5
-   - model: F:\FModels\IceCoffeeRP-7b
-     parameters:
-       density: 0.5
-       weight: 0.7
- merge_method: ties
- base_model: G:\FModels\Mistroll-7B-v2.2
- parameters:
-   normalize: true
-   int8_mask: true
- dtype: float16
-
- ```
+ ---
+ license: cc-by-nc-4.0
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+ - alpaca
+ - mistral
+ - not-for-all-audiences
+ - nsfw
+
+ ---
+ # IceSakeV6RP-7b
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with G:\FModels\Mistroll-7B-v2.2 as the base.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * F:\FModels\IceCoffeeRP-7b
+ * H:\FModels\kuno-royale-v3-7b
+ * G:\FModels\IceSakeV4RP-7b
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ models:
+   - model: H:\FModels\kuno-royale-v3-7b
+     parameters:
+       density: 0.8
+       weight: 0.5
+   - model: G:\FModels\IceSakeV4RP-7b
+     parameters:
+       density: 0.5
+       weight: 0.5
+   - model: F:\FModels\IceCoffeeRP-7b
+     parameters:
+       density: 0.5
+       weight: 0.7
+ merge_method: ties
+ base_model: G:\FModels\Mistroll-7B-v2.2
+ parameters:
+   normalize: true
+   int8_mask: true
+ dtype: float16
+
+ ```
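
The commit itself stops at the YAML configuration. A merge like this is typically reproduced by saving the config above to a file and passing it to mergekit's `mergekit-yaml` CLI (for example `mergekit-yaml config.yaml ./IceSakeV6RP-7b`); the exact invocation is not shown in the commit. Below is a minimal sketch of loading the resulting model with the `transformers` library declared in the front matter. The repo id `icefog72/IceSakeV6RP-7b` and the Alpaca-style prompt (suggested by the `alpaca` tag) are assumptions, not facts stated in the README.

```python
# Minimal usage sketch, not part of the commit. Assumes the merged model was
# uploaded as "icefog72/IceSakeV6RP-7b" and follows Alpaca-style prompting
# (inferred from the `alpaca` tag); adjust both if they differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "icefog72/IceSakeV6RP-7b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # the merge was produced in float16; "auto" uses the stored dtype
    device_map="auto",    # requires `accelerate`; places layers on available devices
)

# Alpaca-style prompt template (assumption based on the `alpaca` tag).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nIntroduce your character in two sentences.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.8,
)

# Strip the prompt tokens and print only the newly generated text.
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```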