Update README.md
README.md CHANGED
@@ -17,6 +17,19 @@ This is an English & Chinese MoE Model, slightly different from [cloudyu/Mixtra
 * [SUSTech/SUS-Chat-34B]
 
 
+# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B)
+
+| Metric                          |Value|
+|---------------------------------|----:|
+|Avg.                             |76.72|
+|AI2 Reasoning Challenge (25-Shot)|71.08|
+|HellaSwag (10-Shot)              |85.23|
+|MMLU (5-Shot)                    |77.47|
+|TruthfulQA (0-shot)              |66.19|
+|Winogrande (5-shot)              |84.85|
+|GSM8k (5-shot)                   |75.51|
+
 gpu code example
 
 ```