Casual-Autopsy committed on
Commit 7f24d40
1 Parent(s): 77abfd2

Update README.md

Files changed (1)
  1. README.md +61 -4
README.md CHANGED
@@ -1,8 +1,4 @@
  ---
- base_model:
- - Casual-Autopsy/Gemma-Rad-Uncen
- - Casual-Autopsy/Gemma-Rad-RP
- - Casual-Autopsy/Gemma-Rad-IQ
  library_name: transformers
  tags:
  - mergekit
@@ -37,6 +33,67 @@ The following models were included in the merge:

  The following YAML configuration was used to produce this model:

+ ```yaml
+ models:
+   - model: crestf411/gemma2-9B-sunfall-v0.5.2
+   - model: crestf411/gemma2-9B-daybreak-v0.5
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.5, 0.13, 0.5, 0.13, 0.3]
+   - model: crestf411/gemstone-9b
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.13, 0.5, 0.13, 0.5, 0.13]
+ merge_method: dare_ties
+ base_model: crestf411/gemma2-9B-sunfall-v0.5.2
+ parameters:
+   normalize: false
+   int8_mask: true
+ dtype: bfloat16
+ ```
+
+ ```yaml
+ models:
+   - model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
+   - model: nldemo/Gemma-9B-Summarizer-QLoRA
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.0625, 0.25, 0.0625, 0.25, 0.0625]
+   - model: SillyTilly/google-gemma-2-9b-it+rbojja/gemma2-9b-intent-lora-adapter
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.0625, 0.25, 0.0625, 0.25, 0.0625]
+   - model: nbeerbower/gemma2-gutenberg-9B
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.25, 0.0625, 0.25, 0.0625, 0.25]
+ merge_method: ties
+ base_model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
+ parameters:
+   normalize: false
+   int8_mask: true
+ dtype: bfloat16
+ ```
+
+ ```yaml
+ models:
+   - model: IlyaGusev/gemma-2-9b-it-abliterated
+   - model: TheDrummer/Smegmma-9B-v1
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.5, 0.13, 0.5, 0.13, 0.3]
+   - model: TheDrummer/Tiger-Gemma-9B-v1
+     parameters:
+       density: [0.7, 0.5, 0.3, 0.35, 0.65, 0.35, 0.75, 0.25, 0.75, 0.35, 0.65, 0.35, 0.3, 0.5, 0.7]
+       weight: [0.13, 0.5, 0.13, 0.5, 0.13]
+ merge_method: dare_ties
+ base_model: IlyaGusev/gemma-2-9b-it-abliterated
+ parameters:
+   normalize: false
+   int8_mask: true
+ dtype: bfloat16
+ ```
+
  ```yaml
  models:
    - model: Casual-Autopsy/Gemma-Rad-RP