# Model Card for Ancient Greek to Polish Interlinear Translation Model
This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.
## Model Details

### Model Description
- Developed By: Maciej Rapacz, AGH University of Kraków
- Model Type: Neural machine translation (T5-based)
- Base Model: PhilTa
- Tokenizer: PhilTa
- Language(s): Ancient Greek (source) → Polish (target)
- License: CC BY-NC-SA 4.0
- Tag Set: BH (Bible Hub)
- Text Preprocessing: Normalized
- Morphological Encoding: emb-concat
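
Since the card lists a T5-based architecture with the PhilTa tokenizer, the checkpoint can presumably be loaded with the Hugging Face `transformers` seq2seq classes. The snippet below is a minimal sketch only: the repository ID is a placeholder (the actual checkpoint name is not stated here), and it does not reproduce the emb-concat morphological encoding, which would require supplying morphological tags alongside the source text.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repository ID -- substitute the actual checkpoint name.
model_id = "path/to/greek-polish-interlinear"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example Ancient Greek source (John 1:1); apply the same text
# normalization that was used during training before tokenizing.
source = "Ἐν ἀρχῇ ἦν ὁ λόγος"

inputs = tokenizer(source, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```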
## Model Performance
- BLEU Score: 0.26
- SemScore: 0.58
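
As context for the BLEU figure, the following sketch shows how corpus-level BLEU is typically computed with `sacrebleu`; the hypothesis and reference strings are invented for illustration, and note that `sacrebleu` reports scores on a 0–100 scale.

```python
import sacrebleu

# Invented Polish hypothesis/reference pair, for illustration only.
hypotheses = ["na początku było słowo"]
references = [["na początku było Słowo"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")  # reported on a 0-100 scale
```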
## Model Sources