Monero committed
Commit: df38374
Parent: 9a01e3a

Update README.md

Files changed (1): README.md (+13 -2)
README.md CHANGED
@@ -8,7 +8,7 @@ tags:
  pipeline_tag: text-generation
  inference: false
  ---
- <h1 style="text-align: center">Metharme 7B4 bit</h1>
+ <h1 style="text-align: center">Metharme 7B 4 bit</h1>
  <h2 style="text-align: center">An instruction-tuned LLaMA biased towards fiction writing and conversation.</h2>

  ## Model Details
@@ -80,4 +80,15 @@ Same process applies. Usually, it is best to do a sliding window over the user a

  The intended use-case for this model is fictional writing for entertainment purposes. Any other sort of usage is out of scope.

- As such, it was **not** fine-tuned to be safe and harmless: the base model _and_ this fine-tune have been trained on data known to contain profanity and texts that are lewd or otherwise offensive. It may produce socially unacceptable or undesirable text, even if the prompt itself does not include anything explicitly offensive. Outputs might often be factually wrong or misleading.
+ As such, it was **not** fine-tuned to be safe and harmless: the base model _and_ this fine-tune have been trained on data known to contain profanity and texts that are lewd or otherwise offensive. It may produce socially unacceptable or undesirable text, even if the prompt itself does not include anything explicitly offensive. Outputs might often be factually wrong or misleading.
+
+
+ <p><strong><font size="5">Benchmarks</font></strong></p>
+
+ <p><strong><font size="4">--true-sequential --groupsize 32</font></strong></p>
+
+ <strong>Wikitext2</strong>: 6.424218654632568
+
+ <strong>Ptb-New</strong>: 48.48588943481445
+
+ <strong>C4-New</strong>: 8.089512825012207
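
For context on the figures added above: Wikitext2, Ptb-New, and C4-New are the perplexity scores (lower is better) that GPTQ-style quantization scripts report after quantizing, and `--true-sequential --groupsize 32` are the options used to produce the 4-bit weights. `--true-sequential` quantizes the layers inside each transformer block in order, feeding each step the outputs of the already-quantized layers before it, and a group size of 32 fits quantization scales to smaller weight blocks than the more common 128, trading a slightly larger file for accuracy closer to fp16.

The sketch below shows how such perplexity numbers are conventionally computed: concatenate the test split, score it in non-overlapping full-context chunks, and exponentiate the average per-token negative log-likelihood. It is a minimal illustration, not this repo's actual evaluation script; it assumes an fp16 checkpoint loadable through `transformers`, and the model path is a placeholder (a 4-bit GPTQ checkpoint would need a GPTQ-aware loader, but the computation afterwards is identical).

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/metharme-7b"  # placeholder, not a real repo path
SEQLEN = 2048                       # LLaMA's context window, the usual chunk size

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

# Join the whole test split into one token stream, as GPTQ-style evals do.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
enc = tokenizer("\n\n".join(test["text"]), return_tensors="pt")

nlls = []
n_chunks = enc.input_ids.size(1) // SEQLEN
for i in range(n_chunks):
    chunk = enc.input_ids[:, i * SEQLEN : (i + 1) * SEQLEN].to(model.device)
    with torch.no_grad():
        # labels=chunk makes the model return the mean cross-entropy over
        # the chunk; scale it back up so chunks can be averaged globally.
        loss = model(chunk, labels=chunk).loss
    nlls.append(loss.float() * SEQLEN)

ppl = torch.exp(torch.stack(nlls).sum() / (n_chunks * SEQLEN))
print(f"Wikitext2 perplexity: {ppl.item():.4f}")
```

Run against an unquantized 7B checkpoint, this needs roughly 14 GB of GPU memory in fp16; the point of the 4-bit quantization benchmarked above is that the same model fits in about a quarter of that, at only the modest perplexity cost shown.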