Update README.md
README.md CHANGED
@@ -13,9 +13,9 @@ large batch size of 0.5M tokens. A larger 762 million parameter model can also b
 #### How to use

 ```python
-from transformers import MegatronBertModel,
+from transformers import MegatronBertModel, PreTrainedTokenizerFast
 model = MegatronBertModel.from_pretrained("mmukh/SOBertBase")
-tokenizer =
+tokenizer = PreTrainedTokenizerFast.from_pretrained("mmukh/SOBertBase")

 ```
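The updated snippet loads the model and tokenizer but stops before using them. A minimal usage sketch that builds on it, assuming the mmukh/SOBertBase checkpoint ships a fast tokenizer and that PyTorch is installed (the input text below is a made-up example):

```python
import torch
from transformers import MegatronBertModel, PreTrainedTokenizerFast

model = MegatronBertModel.from_pretrained("mmukh/SOBertBase")
tokenizer = PreTrainedTokenizerFast.from_pretrained("mmukh/SOBertBase")

# Encode a Stack Overflow-style post and run a forward pass (hypothetical example text)
inputs = tokenizer("How do I read a file line by line in Python?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level embeddings with shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```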