Update README.md
README.md
CHANGED
@@ -57,8 +57,7 @@ Here are the evaluation results for DCLM-1B models on various tasks (using [llm-
 
 | Task | Core | Extended | MMLU 5-shot |
 |:---------:|:------:|:----------:|:-------------:|
-| DCLM-1B |
-| DCLM-1B-v2| 45.2 | 28.1 | 47.5 |
+| DCLM-1B | 45.2 | 28.1 | 47.5 |
 | DCLM-1B-IT| 47.1 | 33.6 | 51.4 |
 
 Note: All scores are presented as decimal values between 0 and 1, representing the proportion of correct answers or the model's performance on each task.