**AraBERTMo** is an Arabic pre-trained language model based on [Google's BERT architecture](https://github.com/google-research/bert). AraBERTMo uses the same BERT-Base config.

AraBERTMo now comes in 10 new variants.

All models are available on the `HuggingFace` model page under the [Ebtihal](https://huggingface.co/Ebtihal/) name. Checkpoints are available in PyTorch format.

## Load Pretrained Model

You can use this model by installing `torch` or `tensorflow` together with the Hugging Face `transformers` library, and then loading it directly like this:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Ebtihal/AraBertMo_base_V1")
model = AutoModelForMaskedLM.from_pretrained("Ebtihal/AraBertMo_base_V1")
```
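Since the checkpoint exposes a masked-LM head, it can be queried for fill-mask predictions. A minimal sketch using the `transformers` `fill-mask` pipeline; the Arabic example sentence is an illustrative assumption, not from the model card:

```python
from transformers import pipeline

# Build a fill-mask pipeline on the same checkpoint.
fill_mask = pipeline("fill-mask", model="Ebtihal/AraBertMo_base_V1")

# [MASK] is the standard BERT mask token; the sentence is illustrative.
predictions = fill_mask("السلام عليكم ورحمة [MASK] وبركاته")

# Each prediction carries the filled token and its score.
for p in predictions:
    print(p["token_str"], p["score"])
```

The pipeline returns the top candidate tokens for the masked position, ranked by probability.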