Update README.md
README.md
CHANGED
@@ -18,9 +18,9 @@ Clinical-Longformer consistently out-perform ClinicalBERT across 10 baseline dat
 
 Load the model directly from Transformers:
 ```
-from transformers import AutoTokenizer,
-tokenizer = AutoTokenizer.from_pretrained("yikuan8/Clinical-Longformer")
-model =
+from transformers import AutoTokenizer, AutoModelForMaskedLM
+tokenizer = AutoTokenizer.from_pretrained("yikuan8/Clinical-Longformer", use_auth_token=True)
+model = AutoModelForMaskedLM.from_pretrained("yikuan8/Clinical-Longformer", use_auth_token=True)
 ```
 
 If you find our implementation helps, please consider citing this :)
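For reference, a minimal usage sketch (not part of the commit above) of how the updated loading snippet could be exercised with the fill-mask pipeline; the clinical sentence is purely illustrative, and it assumes the checkpoint is publicly readable so no auth token is passed:

```
# Usage sketch (assumption, not from the commit): score masked tokens with the
# checkpoint loaded as in the README snippet. The example sentence is illustrative only.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("yikuan8/Clinical-Longformer")
model = AutoModelForMaskedLM.from_pretrained("yikuan8/Clinical-Longformer")

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
text = f"The patient was started on {tokenizer.mask_token} for the infection."
for prediction in fill_mask(text):
    # Each prediction is a dict with the filled token and its score.
    print(prediction["token_str"], round(prediction["score"], 4))
```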