Commit 31ae43d
Parent(s): 6643bd6

Adding Evaluation Results (#10)

- Adding Evaluation Results (445bca5530ea5a99a2a2bf25f054a2040410bae5)

Co-authored-by: Open LLM Leaderboard PR Bot <leaderboard-pr-bot@users.noreply.huggingface.co>
README.md CHANGED
@@ -285,4 +285,17 @@ Then you should be ready to generate!
   eprint={2307.09288},
   archivePrefix={arXiv},
 }
-```
+```
+
+# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B)
+
+| Metric                | Value                     |
+|-----------------------|---------------------------|
+| Avg.                  | 50.76                     |
+| ARC (25-shot)         | 62.37                     |
+| HellaSwag (10-shot)   | 82.96                     |
+| MMLU (5-shot)         | 58.68                     |
+| TruthfulQA (0-shot)   | 51.23                     |
+| Winogrande (5-shot)   | 77.19                     |
+| GSM8K (5-shot)        | 14.1                      |
+| DROP (3-shot)         | 8.75                      |
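
As a usage note (not part of the diff above): the linked details dataset can typically be pulled with the Hugging Face `datasets` library. The sketch below assumes the repo exposes one config per task/shot setting and a `latest` split, as Open LLM Leaderboard details repos usually do; the config name shown is an assumption, so check the dataset card for the real list.

```python
# Minimal sketch: load the detailed per-task results referenced in the README section added above.
from datasets import load_dataset

details = load_dataset(
    "open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B",
    "harness_arc_challenge_25",  # assumed config name (one config per task/shot setting)
    split="latest",              # assumed split; details repos typically keep a "latest" split
)
print(details[0])  # inspect one evaluated example with its per-sample metrics
```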