md-nishat-008 committed · Commit 1990cf6 · Parent(s): a5020a8
Update README.md

README.md CHANGED
@@ -1,7 +1,7 @@
 ---
 license: apache-2.0
 ---
-The model is pretrained on the OSCAR dataset for Bangla, English and Hindi. The base model is Distil-BERT and the intended use for this model is for the datasets that contain a Code-mixing of these languages.
+The model is pretrained on the OSCAR dataset for Bangla, English and Hindi. And further pre-trained on 560k code-mixed data (Bangla-English-Hindi). The base model is Distil-BERT and the intended use for this model is for the datasets that contain a Code-mixing of these languages.
 
 To cite:
 
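The intended use described in the README (masked-language-model inference on code-mixed Bangla-English-Hindi text) can be sketched with the Hugging Face `transformers` library. This is a minimal sketch, not the author's published usage: the repository ID below is a hypothetical placeholder, since the diff does not name the model repo, and it assumes the checkpoint uses DistilBERT's standard `[MASK]` token.

```python
# Hypothetical repo ID -- the commit diff does not state the actual model name;
# substitute the real Hugging Face repository ID before running.
MODEL_ID = "md-nishat-008/codemixed-distilbert"


def build_fill_mask(model_id: str = MODEL_ID):
    """Build a fill-mask pipeline for the code-mixed DistilBERT checkpoint.

    The import is deferred so the module can be loaded without
    `transformers` installed (it is required only at call time).
    """
    from transformers import pipeline  # requires: pip install transformers

    return pipeline("fill-mask", model=model_id)


if __name__ == "__main__":
    fill = build_fill_mask()
    # A code-mixed input sentence; [MASK] is DistilBERT's mask token.
    for candidate in fill("I will go to the [MASK] tomorrow."):
        print(candidate["token_str"], candidate["score"])
```

The fill-mask pipeline downloads the model weights on first use; any downstream classification fine-tuning on code-mixed datasets would instead load the checkpoint via `AutoModelForSequenceClassification`.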