Update README.md
README.md
@@ -12,7 +12,7 @@ We then utilized a large-scale corpus of EHRs from over 3 million patient record
 ## Pretraining Data
 
 The ClinicalBERT model was trained on a large multicenter dataset we constructed, comprising a corpus of 1.2B words covering diverse diseases.
 
-For more details, see here.
+<!-- For more details, see here. -->
 
 ## Model Pretraining
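
Since the README this commit edits documents a pretrained transformer checkpoint, a brief usage sketch may help readers who land on the diff. This is a minimal example assuming the checkpoint is published on the Hugging Face Hub under the ID `medicalai/ClinicalBERT`; that ID is an assumption not stated in this diff, so substitute the repository's actual model ID if it differs.

```python
from transformers import AutoModel, AutoTokenizer

# Assumption: "medicalai/ClinicalBERT" is the Hub ID for this checkpoint;
# replace it with the actual model ID if yours differs.
tokenizer = AutoTokenizer.from_pretrained("medicalai/ClinicalBERT")
model = AutoModel.from_pretrained("medicalai/ClinicalBERT")

# Encode a clinical sentence and obtain contextual token embeddings.
inputs = tokenizer("The patient presented with acute chest pain.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```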