---
license: cc-by-sa-4.0
language:
- pl
metrics:
- bleu
base_model:
- google/mt5-base
library_name: transformers
datasets:
- mrapacz/greek-interlinear-translations
---
# Model Card for Ancient Greek to Polish Interlinear Translation Model
This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.
## Model Details
### Model Description
- **Developed By:** Maciej Rapacz, AGH University of Kraków
- **Model Type:** Neural machine translation (T5-based)
- **Base Model:** mT5-base
- **Tokenizer:** mT5
- **Language(s):** Ancient Greek (source) → Polish (target)
- **License:** CC BY-NC-SA 4.0
- **Tag Set:** Unused
- **Text Preprocessing:** Normalized
- **Morphological Encoding:** baseline (text only, no morphological tags)
### Model Performance
- **BLEU Score:** 26.21
- **SemScore:** 0.85
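As a rough illustration of how a corpus-level BLEU score such as the one reported above can be computed, the sketch below uses sacrebleu; it is a generic example, not necessarily the exact evaluation setup used in the paper, and `hypotheses` and `references` are placeholder lists standing in for model outputs and gold interlinear translations.
```python
import sacrebleu

# Placeholder data: model outputs and the corresponding gold interlinear translations.
hypotheses = ["slowo po slowie przetlumaczone zdanie"]
references = [["slowo po slowie przetlumaczone zdanie"]]

# Corpus-level BLEU with a single reference per hypothesis.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```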
### Model Sources
- **Repository:** https://github.com/mrapacz/loreslm-interlinear-translation
- **Paper:** https://aclanthology.org/2025.loreslm-1.11/
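## Usage
A minimal usage sketch with the `transformers` library, assuming the standard seq2seq API for an mT5-based checkpoint. The checkpoint identifier and the sample verse are placeholders (substitute this repository's actual Hugging Face model ID); the model expects plain, normalized Ancient Greek text without morphological tags, as described above.
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder: replace with this repository's actual Hugging Face model ID.
checkpoint = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Illustrative input: normalized Ancient Greek text, no morphological tags.
text = "λεγει αυτω ο ιησους εγειρε αρον τον κραβαττον σου και περιπατει"

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```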