DavidAU committed · verified
Commit 1d9f1f2 · 1 Parent(s): a584d04

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -19,7 +19,7 @@ tags:
 - 32 bit upscale
 ---
 
-<font color=red><h3> Ultra Quality High Remaster of the incredible: Psyonic-Cetacean-20b + Mythomax 13B MERGE to 29B </h3></font>
+<font color=red><h3> Ultra Quality High Remaster of the incredible: Psyonic-Cetacean-20b + Mythomax 13B MERGED to 29 Billion parameters. </h3></font>
 
 This is a Floating Point 32 upscale, where all components and merges were remastered to floating point 32.
 This includes all the merges (recreated with master files), and where possible subbing full FP32 models.
@@ -53,14 +53,14 @@ This is version 1 of 7 current versions, with sub-versions as well.
 This is also a merge between the Ultra Quality Psyonic-Cetacean 20B with the 13B Mythomax model
 which ends up at 29 Billion parameters at 90 layers.
 
-For reference a 70B model is typically 120 layers, and Command-R 01 is 40 layers (but very dense layers).
+For reference a 70B model is typically 120 layers, and Command-R 01 35B is 40 layers (but very dense layers).
 
 These models are a "pass-through" merge, meaning that all the unique qualities of all models is preserved in full,
 no overwriting or merging of the parameters, weights and so on.
 
-Although this model can be used for many purposes, it primary is creative prose.
+Although this model can be used for many purposes, it is primarily for creative prose.
 
-Because of the unique merge this model (and/or versions of it) may make the odd "typo" but it can also make up
+Because of the unique merge this model (and versions of it) may make the odd "typo" but it can also make up
 words on the fly too.
 
 See prose examples below.
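
For readers unfamiliar with the technique: a "pass-through" merge stacks whole layer slices from the source models end to end instead of averaging their weights, which is how a 20B and a 13B model combine into a 29B, 90-layer stack. Below is a minimal sketch of what such a recipe looks like as a mergekit `passthrough` config. It is an illustration under stated assumptions only: the model paths and layer ranges are hypothetical placeholders, not the published recipe for this model.

# Hypothetical mergekit "passthrough" recipe -- illustrative only, not the
# actual config behind this model. Each slice is copied verbatim into the
# new stack, so no parameters are overwritten or averaged.
slices:
  - sources:
      - model: psyonic-cetacean-20B-fp32   # placeholder path (FP32 remaster)
        layer_range: [0, 50]               # hypothetical slice: 50 layers
  - sources:
      - model: MythoMax-L2-13B-fp32        # placeholder path
        layer_range: [0, 40]               # hypothetical slice: 40 layers
merge_method: passthrough                  # concatenate; do not merge weights
dtype: float32                             # keep full 32-bit precision

Fed to mergekit (e.g. `mergekit-yaml config.yml ./merged`), a recipe of this shape yields a 50 + 40 = 90-layer model; the actual slice boundaries used for this model may differ.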