SheldonSides committed · Commit eb8b07c
1 Parent(s): 08fd542
Update README.md
README.md CHANGED
@@ -42,7 +42,7 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-
+#### Base Model Info.
 DistilBERT is a transformers model, smaller and faster than BERT, which was pretrained on the same corpus in a
 self-supervised fashion, using the BERT base model as a teacher. This means it was pretrained on the raw texts only,
 with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic
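
Since the new section describes DistilBERT only in prose, here is a minimal sketch of loading the base checkpoint with the transformers library. The checkpoint name `distilbert-base-uncased` is the public base model, not the fine-tuned repository this commit belongs to (its id is not shown in the diff), so it is only an assumed placeholder; substitute the actual repo id to use the fine-tuned weights.

```python
# Minimal sketch: load the DistilBERT base model referenced in the model card.
# "distilbert-base-uncased" is the public base checkpoint, used here as a
# placeholder because the fine-tuned repo id does not appear in this diff.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Encode a sample sentence and run a forward pass to get contextual embeddings.
inputs = tokenizer("DistilBERT is smaller and faster than BERT.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, 768)
```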