---
license: cc-by-nc-4.0
---
I recently ran into this [quantized model of Noromaid](https://huggingface.co/rAIfle/NoroMaid-v0.4-Mixtral-Instruct-8x7b-Zloss-exl2-rpcal) with the "rpcal" suffix on the end.<br>
The secret sauce: it was quantized using an [RP dataset](https://huggingface.co/datasets/royallab/PIPPA-cleaned) for calibration instead of the standard llama dataset.<br>
On a test drive, it seemed to make an enormous difference in the quality of the output, so I'm trying to quantify whether the difference is really significant.<br>
I plan to run a variety of things like [The Sarah Test](https://rentry.org/thesarahtest) to see if I can find any objective differences.<br>
My favorite model for the past few weeks has been really enjoyable, so I want to see how it compares to this one.<br>
 
Favorite: [zaq-hack/Noromaid-v0.4-Mixtral-Instruct-8x7b-Zloss-bpw300-h6-exl2](https://huggingface.co/zaq-hack/Noromaid-v0.4-Mixtral-Instruct-8x7b-Zloss-bpw300-h6-exl2)<br>
This model: EXL2 @ 3.0 bpw, using the RP samples instead of the standard calibration set.
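Below is a minimal sketch of how an RP-calibrated EXL2 quant like this can be put together: flatten the PIPPA-cleaned data into a parquet file, then point exllamav2's converter at it. The split/column names and the `convert.py` flags here are assumptions to check against the dataset card and your exllamav2 version, not a record of the exact commands used for this model.

```python
# Sketch: build an RP-flavored calibration file for an EXL2 quant.
# Assumptions: PIPPA-cleaned has a "train" split with a "conversation" column,
# and convert.py accepts the flags shown at the end. Verify both before running.
from datasets import load_dataset
import pandas as pd

ds = load_dataset("royallab/PIPPA-cleaned", split="train")

rows = []
for example in ds:
    # Flatten each conversation's turns into one block of text for calibration;
    # adjust the field names to match the dataset's actual layout.
    turns = example.get("conversation", [])
    text = "\n".join(
        turn.get("message", "") if isinstance(turn, dict) else str(turn)
        for turn in turns
    )
    if text.strip():
        rows.append({"text": text})

pd.DataFrame(rows).to_parquet("pippa_cleaned_cal.parquet")

# Hypothetical conversion invocation (3.0 bpw, 6-bit head, RP calibration);
# check the flag names against your exllamav2 checkout's convert.py --help.
print(
    "python convert.py -i NoroMaid-v0.4-Mixtral-Instruct-8x7b-Zloss "
    "-o work -cf NoroMaid-rpcal-3.0bpw-exl2 -b 3.0 -hb 6 "
    "-c pippa_cleaned_cal.parquet"
)
```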
I'm just tinkering. All credit to the original creators: Noromaid is hot.
 
  ![image/png](https://cdn-uploads.huggingface.co/production/uploads/630dfb008df86f1e5becadc3/vwcJfOnL-2QDJ0ShfxRJ5.png)