
Perplexity (ctx 8192)

BF16:

I1005 10:12:54.337877 967332 eval_ppl.py:66] wikitext2 perplexity: 6.49992036819458
I1005 10:17:19.718732 967332 eval_ppl.py:66] c4 perplexity: 8.022844314575195

4-bit QTIP (this model):

I1005 10:13:22.272425 967335 eval_ppl.py:66] wikitext2 perplexity: 6.610454082489014
I1005 10:19:33.215519 967335 eval_ppl.py:66] c4 perplexity: 8.128130912780762
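
The perplexity numbers above come from the eval_ppl.py script at a context length of 8192 tokens. As a rough reference, the sketch below shows how fixed-context perplexity on wikitext2 is typically computed with a standard Hugging Face causal LM; the model path is a placeholder, the chunking and tokenization in eval_ppl.py may differ, and loading this 4-bit QTIP checkpoint additionally requires the QTIP inference code.

```python
# Minimal sketch of fixed-context (ctx 8192) perplexity on wikitext2, assuming a
# standard Hugging Face causal LM. The repo's eval_ppl.py may tokenize and chunk
# differently, and loading the 4-bit QTIP checkpoint needs the QTIP inference code.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/model"  # placeholder: the checkpoint being evaluated
ctx = 8192                  # context length used for the numbers above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
).eval()

# Concatenate the test split into one token stream and score non-overlapping chunks.
text = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
ids = tokenizer(text, return_tensors="pt").input_ids[0]

nlls = []
with torch.no_grad():
    for start in range(0, ids.numel() - ctx, ctx):
        chunk = ids[start:start + ctx].unsqueeze(0).to(model.device)
        # labels == input_ids: the model shifts internally and returns mean token NLL
        nlls.append(model(chunk, labels=chunk).loss.float())

print("wikitext2 perplexity:", torch.exp(torch.stack(nlls).mean()).item())
```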

Zeroshot Results with lm_eval 0.3.0

BF16:

|    Task     |Version| Metric |Value |   |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge|      0|acc     |0.5196|±  |0.0146|
|             |       |acc_norm|0.5512|±  |0.0145|
|arc_easy     |      0|acc     |0.8182|±  |0.0079|
|             |       |acc_norm|0.7980|±  |0.0082|
|boolq        |      1|acc     |0.8410|±  |0.0064|
|piqa         |      0|acc     |0.8003|±  |0.0093|
|             |       |acc_norm|0.8090|±  |0.0092|
|winogrande   |      0|acc     |0.7380|±  |0.0124|

4-bit QTIP (this model):

|    Task     |Version| Metric |Value |   |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge|      0|acc     |0.5179|±  |0.0146|
|             |       |acc_norm|0.5486|±  |0.0145|
|arc_easy     |      0|acc     |0.8157|±  |0.0080|
|             |       |acc_norm|0.7942|±  |0.0083|
|boolq        |      1|acc     |0.8425|±  |0.0064|
|piqa         |      0|acc     |0.8025|±  |0.0093|
|             |       |acc_norm|0.8079|±  |0.0092|
|winogrande   |      0|acc     |0.7419|±  |0.0123|
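
For reference, a minimal sketch of a zeroshot run with lm-evaluation-harness 0.3.0's Python API is below. The checkpoint path is a placeholder; the tables above may have been produced through the QTIP repo's own evaluation wrapper, and loading this model requires the QTIP inference code.

```python
# Minimal sketch of the zeroshot run via lm-evaluation-harness 0.3.0's Python API.
# The checkpoint path is a placeholder; the tables above may have been produced by
# the QTIP repo's own wrapper, and this model needs QTIP's inference code to load.
import json
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf-causal",                      # standard HF causal-LM adapter
    model_args="pretrained=path/to/model",  # placeholder checkpoint path
    tasks=["arc_challenge", "arc_easy", "boolq", "piqa", "winogrande"],
    num_fewshot=0,                          # zeroshot, matching the tables above
    batch_size=8,
)
print(json.dumps(results["results"], indent=2))
print(evaluator.make_table(results))        # prints a table like the ones above
```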