md-nishat-008 committed on
Commit 402cd40 · 1 Parent(s): acc410b

Update README.md
---
license: apache-2.0
---

The model is pretrained on the OSCAR dataset for Bangla, English, and Hindi.
The base model is Distil-BERT, and the intended use is for datasets that contain a mix of these languages.

To cite:

Raihan, M. N., Goswami, D., & Mahmud, A. (2023). Mixed-Distil-BERT: Code-mixed Language Modeling for Bangla, English, and Hindi. arXiv. https://arxiv.org/abs/2309.10272
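The card above does not include a usage snippet. A minimal sketch with the Hugging Face `transformers` fill-mask pipeline might look like the following; the repository ID `md-nishat-008/Mixed-Distil-BERT` is an assumption based on the author name and paper title, so check the Hub for the actual model ID:

```python
from transformers import pipeline

# Hypothetical Hub repository ID -- verify the actual name on the Hub.
fill_mask = pipeline("fill-mask", model="md-nishat-008/Mixed-Distil-BERT")

# DistilBERT-style models use the [MASK] token for masked-language modeling.
results = fill_mask("I really enjoy [MASK] food.")

# Each result is a dict with the predicted token and its score.
for r in results[:3]:
    print(r["token_str"], r["score"])
```

Since the model is pretrained on code-mixed Bangla/English/Hindi text, the same pipeline should accept sentences mixing those languages.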