Commit a3433d1 (parent: 9270cee) by BerenMillidge: Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -64,10 +64,10 @@ Zyda also outperforms Dolma, RefinedWeb, and Fineweb on 1.4B models trained on 5
 
 According to our evaluations, Zyda is the most performant per-token open dataset available in its non-starcoder variant on language tasks and tying with fineweb otherwise.
 
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/65c05e75c084467acab2f84a/fXaQAOBDJpoaAr1clfTel.png)
+<!-- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/65c05e75c084467acab2f84a/fXaQAOBDJpoaAr1clfTel.png) -->
 
 <center>
-<img src="https://cdn-uploads.huggingface.co/production/uploads/65c05e75c084467acab2f84a/fXaQAOBDJpoaAr1clfTel.png" width="600" alt="Zyda performance across steps.">
+<img src="https://cdn-uploads.huggingface.co/production/uploads/65c05e75c084467acab2f84a/fXaQAOBDJpoaAr1clfTel.png" width="1200" alt="Zyda performance across steps.">
 </center>
 
 These results are an aggregate scores of classic language modelling evaluations (piqa, winogrande, openbookqa, arc-easy, arc-challenge) across time for a 1.4B model trained on 50B tokens of each dataset.