md-nishat-008 committed · Commit 0f266a8 · 1 Parent(s): 4be2e5d

Update README.md

Files changed (1): README.md +4 -1

README.md CHANGED
@@ -1,4 +1,7 @@
  ---
  license: apache-2.0
  ---
- The model is pretrained on the OSCAR dataset for Bangla, English and Hindi. The base model is Distil-BERT and the intended use for this model is for the datasets that contain a Code-mixing of these languages.
 
 
 
 
  ---
  license: apache-2.0
  ---
+ The model is pretrained on the OSCAR dataset for Bangla, English, and Hindi. The base model is DistilBERT, and the model is intended for datasets that contain code-mixing of these languages.
+
+ To cite:
+ Raihan, M. N., Goswami, D., & Mahmud, A. (2023). Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi. arXiv. https://arxiv.org/abs/2309.10272
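
Since the card describes a DistilBERT-based masked language model for code-mixed text, a minimal usage sketch with the `transformers` library may help; the Hub model id `md-nishat-008/Mixed-Distil-BERT` is an assumption inferred from the committer name and paper title, not stated in this diff.

```python
# Minimal sketch, assuming the model is published on the Hugging Face Hub
# under md-nishat-008/Mixed-Distil-BERT (hypothetical id -- verify before use).
from transformers import pipeline

# Mixed-Distil-BERT is a masked language model, so the fill-mask
# pipeline is the natural way to probe it on code-mixed input.
fill = pipeline("fill-mask", model="md-nishat-008/Mixed-Distil-BERT")

# A sentence with one masked token; DistilBERT tokenizers use [MASK].
predictions = fill("I am going to [MASK] tomorrow.")

# Each prediction is a dict with the filled token and its score.
for p in predictions[:3]:
    print(p["token_str"], round(p["score"], 4))
```

The pipeline downloads the tokenizer and weights from the Hub on first use, so it requires network access (or a local cache) to run.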