achal-tri committed on
Commit
ef18745
1 Parent(s): 23b5c8b

Update README.md

Files changed (1)
  1. README.md +5 -3
README.md CHANGED
@@ -4,12 +4,14 @@ license: apache-2.0
 
 
 
-<img src="https://cdn-uploads.huggingface.co/production/uploads/63118add64939fabc0108b28/BB42g4V8HTxb5dR4tcy8A.png" alt="DCLM Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
+<img src="https://cdn-uploads.huggingface.co/production/uploads/63118add64939fabc0108b28/BB42g4V8HTxb5dR4tcy8A.png" alt="DCLM Logo" width="300" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
 
 
-# Model Card for DCLM-1B
+Check out our more recent, higher performing model here! https://huggingface.co/TRI-ML/DCLM-1B/
 
-DCLM-1B is a 1.4 billion parameter language model trained on the DCLM-Baseline dataset, which was curated as part of the DataComp for Language Models (DCLM) benchmark. This model is designed to showcase the effectiveness of systematic data curation techniques for improving language model performance.
+# Model Card for DCLM-1B-v0
+
+DCLM-1B-v0 is a 1.4 billion parameter language model trained on the DCLM-Baseline dataset, which was curated as part of the DataComp for Language Models (DCLM) benchmark. This model is designed to showcase the effectiveness of systematic data curation techniques for improving language model performance.
 
 
 ## Model Details