# Model Card for Ancient Greek to English Interlinear Translation Model
This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.
## Model Details

### Model Description
- Developed By: Maciej Rapacz, AGH University of Kraków
- Model Type: Neural machine translation (T5-based)
- Base Model: GreTa
- Tokenizer: GreTa
- Language(s): Ancient Greek (source) → English (target)
- License: CC BY-NC-SA 4.0
- Tag Set: BH (Bible Hub)
- Text Preprocessing: Diacritics retained
- Morphological Encoding: t-w-t (tags-within-text)
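
The snippet below is a minimal inference sketch, assuming the model is hosted on the Hugging Face Hub and loadable through the standard `transformers` seq2seq classes. The repository ID, the inline tag format, and the generation settings are illustrative placeholders, not the exact training-time configuration.

```python
# Minimal inference sketch (not the authors' official pipeline).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-namespace/greta-interlinear-en"  # hypothetical repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# In the t-w-t (tags-within-text) setup, each Greek token is assumed to be
# followed inline by its Bible Hub morphological tag; the exact delimiter
# format here is an assumption for illustration.
source = "λέγει <V-PIA-3S> αὐτῷ <PPro-DM3S>"

inputs = tokenizer(source, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```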
## Model Performance
- BLEU Score: 14.70
- SemScore: 0.55
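
For reference, corpus-level BLEU can be recomputed with `sacrebleu`; the sketch below uses placeholder hypothesis and reference strings and does not reproduce the exact evaluation setup behind the scores above. SemScore is an embedding-based similarity metric and is not shown here.

```python
# BLEU computation sketch with sacrebleu; the strings are placeholders,
# not the actual evaluation data used for the reported scores.
import sacrebleu

hypotheses = ["he says to him"]     # hypothetical model outputs
references = [["he said to him"]]   # one reference stream, parallel to hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```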
## Model Sources