---
license: apache-2.0
---
This model is pretrained on the OSCAR dataset for Bangla, English, and Hindi.
The base model is DistilBERT, and this model is intended for datasets that contain a mix of these three languages.
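
A minimal usage sketch with the 🤗 Transformers library is shown below. The repo id `md-nishat-008/Tri-Distil-BERT` is inferred from this page's location and should be verified on the Hub; the example assumes the checkpoint exposes a masked-language-modeling head, as is standard for DistilBERT-based pretrained models.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repo id, inferred from the model page; verify before use.
MODEL_ID = "md-nishat-008/Tri-Distil-BERT"

# Load the tokenizer and the masked-LM head from the Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# The model can then be fine-tuned on code-mixed Bangla/English/Hindi data
# or used directly for masked-token prediction.
print(model.config.model_type)
```

Because the model targets code-mixed text, fine-tuning data can freely interleave the three languages within a single example.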