DavidAU committed
Commit 410d126
1 Parent(s): c3e1e11

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -24,7 +24,7 @@ tags:
 - 32 bit upscale
 ---
 
-<font color=red><h3> ED2: Ultra Quality High Remaster of the incredible: Psyonic-Cetacean-20b + Mythomax 13B MERGED to 29 Billion parameters. </h3></font>
+<font color=red><h3> ED2: Ultra Quality High Remaster of the incredible: Psyonic-Cetacean-20b + Mythomax 13B MERGED to 29.5 Billion parameters. </h3></font>
 
 This is a Floating Point 32 upscale, where all components and merges were remastered to floating point 32.
 This includes all the merges (recreated with master files), and where possible subbing full FP32 models.
@@ -45,7 +45,7 @@ And decimal points are critical to model performance.
 
 SMALL?
 
-Yes... but multiplied by each merge(s), and compression(s): 29 billion times.
+Yes... but multiplied by each merge(s), and compression(s): 29.5 billion times.
 
 <B>PROSE CRAZY:</B>
 
@@ -64,7 +64,7 @@ However the "Neo" version of this model is still creatively out there, and tends
 and rave with sometimes a "normal" measure and sometime well... extreme. You can see this in the examples.
 
 This model is a merge between the Ultra Quality Psyonic-Cetacean 20B with the 13B Mythomax model
-which ends up at 29 Billion parameters at 92 layers (837 Tensors @ F32).
+which ends up at 29.5 Billion parameters at 92 layers (837 Tensors @ F32).
 
 For reference a 70B model is typically 120 layers, and Command-R 01 35B is 40 layers (but very dense layers).
 
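The "remastered to floating point 32" line in the first hunk is the technical core of this commit's README. As a minimal sketch of what one step of such a remaster can look like, assuming the `transformers` library and using the public `Gryphe/MythoMax-L2-13b` repo to stand in for the master files (the actual pipeline behind this model is not published in this commit, and the output path is illustrative only):

```python
# Minimal sketch: re-save one merge component at full FP32 precision.
# The repo id and output directory are illustrative assumptions, not the
# actual remaster pipeline behind this commit.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Gryphe/MythoMax-L2-13b",      # assumed stand-in for the 13B master files
    torch_dtype=torch.float32,     # load weights as full 32-bit floats
)
model.save_pretrained("mythomax-13b-fp32", safe_serialization=True)
```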
 
 
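The "29.5 billion times" line in the second hunk is an argument about compounding rounding error: every merge and compression pass re-rounds every parameter. A quick illustration of the per-weight precision gap between FP16 and FP32, using plain NumPy (the weight value is an arbitrary example, not taken from the model):

```python
# Per-weight rounding error introduced by an FP16 round-trip.
import numpy as np

w = np.float32(0.1234567)            # a weight held at FP32 precision
w_rt = np.float32(np.float16(w))     # the same weight after an FP16 round-trip
print(w, w_rt)                       # the low decimal places differ
print(abs(w - w_rt))                 # per-weight error on the order of 1e-5
# One such error is negligible; the README's point is that it recurs across
# ~29.5e9 parameters on every merge and compression step.
```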