jacobfulano committed on
Commit
68a6d88
1 Parent(s): 4f0fd4f

Update README.md

Files changed (1): README.md (+5 -5)
README.md CHANGED

```diff
@@ -19,7 +19,7 @@ March 2023
 
 ## Documentation
 
-* Blog post
+* [Blog post](https://www.mosaicml.com/blog/mosaicbert)
 * [Github (mosaicml/examples/bert repo)](https://github.com/mosaicml/examples/tree/main/examples/bert)
 
 ## How to use
@@ -150,13 +150,13 @@ Full configuration details for pretraining MosaicBERT-Base can be found in the c
 
 ## Evaluation results
 
-When fine-tuned on downstream tasks, this model achieves the following results:
-
-GLUE test results:
+When fine-tuned on downstream tasks (following the [finetuning details here](https://github.com/mosaicml/examples/blob/main/examples/bert/yamls/finetuning/glue/mosaic-bert-base-uncased.yaml)), the MosaicBERT model achieves the following GLUE results:
 
 | Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
 |:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
-| | | | | | | | | | |
+| | 0.8495 | 0.9029 | 0.9074 | 0.9246 | 0.5511 | 0.8927 | 0.9003 | 0.8136 | 0.8428 |
+
+Note that this is averaged over n=5 pretraining seeds.
 
 ## Intended uses & limitations
```
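As a quick sanity check (not part of the commit itself), the reported GLUE average in the added table row can be recomputed as the unweighted mean of the eight per-task scores:

```python
# Per-task GLUE scores from the updated README table (MosaicBERT-Base,
# averaged over n=5 pretraining seeds per the README).
scores = {
    "MNLI-(m/mm)": 0.8495,
    "QQP": 0.9029,
    "QNLI": 0.9074,
    "SST-2": 0.9246,
    "CoLA": 0.5511,
    "STS-B": 0.8927,
    "MRPC": 0.9003,
    "RTE": 0.8136,
}

# Unweighted mean over the eight tasks, rounded to the table's
# four-decimal precision.
average = round(sum(scores.values()) / len(scores), 4)
print(average)  # → 0.8428
```

The result matches the 0.8428 value in the table's Average column.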