Update README.md

README.md CHANGED

@@ -7,8 +7,8 @@ widget:
 - example_title: "Example 1"
   text: " اهلا وسهلا بكم في [MASK] من سيربح المليون "
 - example_title: "Example 2"
-
 ---
+
 # Arabic BERT Model
 
 **AraBERTMo** is an Arabic pre-trained language model based on [Google's BERT architecture](https://github.com/google-research/bert). AraBERTMo_base uses the same BERT-Base config.
@@ -21,7 +21,7 @@ All models are available on the `HuggingFace` model page under the [Ebtihal](htt
 
 `AraBertMo_base_V1` model was pre-trained on ~3 million words:
 
-
+- [OSCAR](https://traces1.inria.fr/oscar/) - Arabic version "unshuffled_deduplicated_ar".
 
 
 ## Training results
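The README's widget entry exercises the model through the fill-mask task: the `[MASK]` token in the Arabic prompt (roughly, "Welcome to [MASK] of Who Will Win the Million") is filled by the model's top predictions. A minimal sketch of the same query via the `transformers` pipeline follows; the repo id `Ebtihal/AraBertMo_base_V1` is an assumption inferred from the README and should be replaced with the actual Hub id.

```python
# Minimal sketch of querying an AraBERTMo checkpoint with the fill-mask task.
# Assumption: the checkpoint is published as "Ebtihal/AraBertMo_base_V1" on
# the Hugging Face Hub and the `transformers` library is installed.

def masked_prompt() -> str:
    # Widget example from the README front matter; [MASK] marks the
    # position the model should fill in.
    return "اهلا وسهلا بكم في [MASK] من سيربح المليون"

if __name__ == "__main__":
    from transformers import pipeline  # pip install transformers

    fill_mask = pipeline("fill-mask", model="Ebtihal/AraBertMo_base_V1")
    for candidate in fill_mask(masked_prompt()):
        # Each candidate dict carries the filled token and its score.
        print(candidate["token_str"], candidate["score"])
```

This mirrors what the hosted inference widget does when a visitor runs the example on the model page.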