Update README.md
README.md CHANGED
@@ -43,6 +43,27 @@ The released weights were trained on ~70 billion tokens.
We plan to continue training up to 300 billion tokens and update the weights every 20B tokens.
This training run is monolingual and uses the C4 (en) and English Wikipedia datasets.

+## Test results
+
+These are the results from [EleutherAI/lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) at the 80B-token checkpoint.
+
+| Task          |Version| Metric |Value |   |Stderr|
+|---------------|------:|--------|-----:|---|-----:|
+|anli_r1        |      0|acc     |0.3150|±  |0.0147|
+|anli_r2        |      0|acc     |0.3380|±  |0.0150|
+|anli_r3        |      0|acc     |0.3367|±  |0.0136|
+|hellaswag      |      0|acc     |0.4761|±  |0.0050|
+|               |       |acc_norm|0.6308|±  |0.0048|
+|lambada_openai |      0|ppl     |8.9700|±  |0.2606|
+|               |       |acc     |0.5628|±  |0.0069|
+|mathqa         |      0|acc     |0.2318|±  |0.0077|
+|               |       |acc_norm|0.2372|±  |0.0078|
+|piqa           |      0|acc     |0.7448|±  |0.0102|
+|               |       |acc_norm|0.7639|±  |0.0099|
+|winogrande     |      0|acc     |0.5935|±  |0.0138|
+|wsc            |      0|acc     |0.4038|±  |0.0483|
+
+
## Installation

```shell
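The commit does not show how the table above was generated. As a rough sketch only: results in this format typically come from the harness CLI, and the invocation below assumes the older `main.py` interface of lm-evaluation-harness and an HF-compatible checkpoint; the `pretrained=` value is a placeholder, not a path from this repo.

```shell
# Hypothetical reproduction with EleutherAI/lm-evaluation-harness (older main.py interface).
# Replace the pretrained= placeholder with the released checkpoint's local path or Hub id.
python main.py \
    --model hf-causal \
    --model_args pretrained=<path-or-hub-id-of-checkpoint> \
    --tasks anli_r1,anli_r2,anli_r3,hellaswag,lambada_openai,mathqa,piqa,winogrande,wsc \
    --device cuda:0
```

Newer releases of the harness expose the same run through the `lm_eval` console entry point with `--model hf`, so the exact flags may need adjusting to the installed version.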