md-nishat-008 committed · e8456b0 · Parent(s): 0f266a8

Update README.md

README.md CHANGED
@@ -4,4 +4,5 @@ license: apache-2.0
 The model is pretrained on the OSCAR dataset for Bangla, English and Hindi. The base model is Distil-BERT and the intended use for this model is for the datasets that contain a Code-mixing of these languages.
 
 To cite:
+
 Raihan, M. N., Goswami, D., & Mahmud, A. (2023). Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi. ArXiv. /abs/2309.10272