---
language:
- en
- si
base_model:
- facebook/m2m100_418M
pipeline_tag: translation
library_name: transformers
license: mit
---
- This is the trained seq2seq model from our Sinhala transliteration solution submitted to the [Shared Task of the IndoNLP Workshop 2025 @ COLING 2025](https://indonlp-workshop.github.io/IndoNLP-Workshop/sharedTask/).
- The official codebase is available at https://github.com/kasunw22/Sinhala-Transliterator
- Please cite our [paper](https://arxiv.org/abs/2501.00529) and consider starring our original [repository](https://github.com/kasunw22/Sinhala-Transliterator):
```
@article{de2024sinhala,
title={Sinhala Transliteration: A Comparative Analysis Between Rule-based and Seq2Seq Approaches},
author={De Mel, Yomal and Wickramasinghe, Kasun and de Silva, Nisansa and Ranathunga, Surangika},
journal={arXiv preprint arXiv:2501.00529},
year={2024}
}
```
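
Since the card declares `library_name: transformers`, `pipeline_tag: translation`, and the `facebook/m2m100_418M` base, the model can presumably be loaded with the standard M2M100 classes. A minimal sketch follows; the model ID, the example input, and the convention of tagging the Romanized side as `en` and the Sinhala-script side as `si` are assumptions (check the repository's README for the exact usage):

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Hypothetical Hub path -- replace with this repository's actual model ID.
model_id = "kasunw22/sinhala-transliterator-m2m100"

tokenizer = M2M100Tokenizer.from_pretrained(model_id)
model = M2M100ForConditionalGeneration.from_pretrained(model_id)

# Romanized Sinhala input (illustrative example).
text = "oyaata kohomada"
tokenizer.src_lang = "en"  # assumption: Romanized text uses the "en" tag
encoded = tokenizer(text, return_tensors="pt")

# Force generation to start with the Sinhala-script language token.
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("si"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```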