---
license: apache-2.0
---
The model is pretrained on the OSCAR dataset for Bangla, English, and Hindi. The base model is DistilBERT, and this model is intended for datasets that contain a code-mixed blend of these three languages.
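A minimal usage sketch with the Hugging Face `transformers` library is given below. The repository ID used here is a hypothetical placeholder, not confirmed by this card; replace it with the model's actual Hub ID.

```python
# A minimal sketch, assuming the model is published on the Hugging Face Hub.
# MODEL_ID below is a hypothetical placeholder; substitute the real Hub ID.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

MODEL_ID = "md-nishat-008/Mixed-Distil-BERT"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Fill-mask inference on a (possibly code-mixed) input sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"I love eating {tokenizer.mask_token} with my friends."))
```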
To cite:

```bibtex
@article{raihan2023mixed,
  title={Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi},
  author={Raihan, Md Nishat and Goswami, Dhiman and Mahmud, Antara},
  journal={arXiv preprint arXiv:2309.10272},
  year={2023}
}
```