icefog72 committed on
Commit
380f842
1 Parent(s): e99c53e

Update README.md

Files changed (1)
  1. README.md +11 -7
README.md CHANGED
@@ -4,9 +4,13 @@ library_name: transformers
 tags:
 - mergekit
 - merge
-
+- alpaca
+- mistral
+- not-for-all-audiences
+- nsfw
+license: cc-by-nc-4.0
 ---
-# WestIceLemonTeaRP
+# WestIceLemonTeaRP-32k-8bpw
 
 This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
 
@@ -18,8 +22,8 @@ This model was merged using the SLERP merge method.
 ### Models Merged
 
 The following models were included in the merge:
-* G:\FModels\WestWizardIceLemonTeaRP
-* H:\FModels\IceLemonTeaRP-32k-7b
+* WestWizardIceLemonTeaRP
+* IceLemonTeaRP-32k-7b
 
 ### Configuration
 
@@ -29,12 +33,12 @@ The following YAML configuration was used to produce this model:
 
 slices:
 - sources:
-  - model: H:\FModels\IceLemonTeaRP-32k-7b
+  - model: IceLemonTeaRP-32k-7b
     layer_range: [0, 32]
-  - model: G:\FModels\WestWizardIceLemonTeaRP
+  - model: WestWizardIceLemonTeaRP
    layer_range: [0, 32]
 merge_method: slerp
-base_model: H:\FModels\IceLemonTeaRP-32k-7b
+base_model: IceLemonTeaRP-32k-7b
 parameters:
   t:
   - filter: self_attn
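
For reference, a complete mergekit SLERP configuration of this shape also specifies per-filter interpolation weights under `t` and an output `dtype`. The hunk above is truncated before those values, so the sketch below is only illustrative: it reuses the model names from this commit, but the `t` schedule and `dtype` are placeholders, not the values actually used for WestIceLemonTeaRP-32k.

```yaml
# Illustrative SLERP config sketch -- t values and dtype are placeholders,
# not taken from this repository's truncated hunk above.
slices:
  - sources:
      - model: IceLemonTeaRP-32k-7b
        layer_range: [0, 32]
      - model: WestWizardIceLemonTeaRP
        layer_range: [0, 32]
merge_method: slerp
base_model: IceLemonTeaRP-32k-7b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]   # placeholder interpolation curve for attention weights
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]   # placeholder interpolation curve for MLP weights
    - value: 0.5                     # placeholder default for all remaining tensors
dtype: bfloat16                      # placeholder output precision
```

A config like this is typically run with mergekit's CLI, e.g. `mergekit-yaml config.yml ./output-model-directory --cuda`, which writes the merged weights to the output directory.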