ProdeusUnity committed
Commit
907beca
1 Parent(s): adc7dab

Update README.md

Files changed (1):
  1. README.md +63 -63
README.md CHANGED
@@ -1,63 +1,63 @@
---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
- # Stellar Odyssey 12b v0.0
+ # Prismatic 12b v0.0
*The sparkling courage I longed for, what I got is small... My tears are surely the prism of tomorrow... Say "Hello!" to the ideal future, let's go see them~*

Listen to the song on YouTube: https://www.youtube.com/watch?v=v3I6EVlyPx4

A one-off merge for a friend, though it came out rather well; I like it, so give it a try.

Merged models:
- mistralai/Mistral-Nemo-Base-2407
- inflatebot/MN-12b-Mag-Mell-R1
- nbeerbower/Mistral-Nemo-Prism-12B-v5

License: Apache 2.0

Format: Mistral Tekken or ChatML

Thank you to AuriAetherwiing for helping me merge the models and for providing compute (A40).

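If you want to try it, here is a minimal, untested sketch of loading the model with transformers (the library named in the card metadata) and prompting it in ChatML, one of the two formats listed above. The model path, the example messages, and the sampling settings are placeholders, not recommendations from the card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute the actual Hugging Face repo id or a local path.
MODEL_ID = "path/to/prismatic-12b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # matches the merge dtype in the config below
    device_map="auto",           # requires the accelerate package
)

# ChatML wraps each turn in <|im_start|>{role} ... <|im_end|>.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWrite two lines about prisms.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```
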
## Details

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the della_linear merge method, with C:\Users\lg911\Downloads\Mergekit-Fixed\mergekit\mistralai_Mistral-Nemo-Base-2407 as the base.

### Models Merged

The following models were included in the merge:

- /inflatebot_MN-12B-Mag-Mell-R1
- /nbeerbower_Mistral-Nemo-Prism-12B-v5

#### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /inflatebot_MN-12B-Mag-Mell-R1
    parameters:
      weight: 0.3
      density: 0.5
  - model: /nbeerbower_Mistral-Nemo-Prism-12B-v5
    parameters:
      weight: 0.4
      density: 0.75
base_model: /mistralai_Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  normalize: true
  lambda: 1
merge_method: ties
dtype: bfloat16
```
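
For anyone who wants to reproduce a merge like this one, a minimal sketch that feeds the config above to mergekit's `mergekit-yaml` command-line tool. It assumes mergekit is installed (`pip install mergekit`), that the config is saved as `merge-config.yml`, and that the three source models already exist on disk at the local paths the config names:

```python
import subprocess

# mergekit-yaml takes the merge config and an output directory.
# The config references local paths, so the source models must
# already be on disk before running this.
subprocess.run(
    ["mergekit-yaml", "merge-config.yml", "./merged-model"],
    check=True,  # raise if the merge fails
)
```

The same thing can be run directly from a shell as `mergekit-yaml merge-config.yml ./merged-model`.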