Model Card for Ancient Greek to Polish Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.
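To illustrate what word-level alignment means here, the sketch below renders a source line and its gloss line in aligned columns, the conventional interlinear layout. The Greek/Polish pairing shown is illustrative only, not model output; the `interlinear` helper is hypothetical and not part of this model's code.

```python
def interlinear(source_tokens, gloss_tokens):
    """Render word-aligned source/gloss token pairs as two column-aligned rows."""
    # Each column is as wide as the longer of the two aligned tokens.
    widths = [max(len(s), len(g)) for s, g in zip(source_tokens, gloss_tokens)]
    src_line = "  ".join(s.ljust(w) for s, w in zip(source_tokens, widths))
    gloss_line = "  ".join(g.ljust(w) for g, w in zip(gloss_tokens, widths))
    return src_line + "\n" + gloss_line

# Illustrative alignment; "-" marks a source word with no Polish counterpart.
greek = ["Ἐν", "ἀρχῇ", "ἦν", "ὁ", "λόγος"]
polish = ["Na", "początku", "było", "-", "Słowo"]
print(interlinear(greek, polish))
```

Each target token stays in the column of the source token it translates, which is the property the model is trained to preserve.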

Model Details

Model Description

  • Developed By: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: mT5-base
  • Tokenizer: mT5
  • Language(s): Ancient Greek (source) → Polish (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: OB (Oblubienica)
  • Text Preprocessing: Diacritics retained
  • Morphological Encoding: emb-concat

Model Performance

  • BLEU Score: 0.63
  • SemScore: 0.63
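For context on the BLEU figure, the sketch below is a minimal self-contained sentence-level BLEU (uniform n-gram weights with a brevity penalty). It is an assumption-laden illustration of the metric, not the scorer used to produce the 0.63 reported above; scores are on a 0–1 scale, where 1.0 means an exact n-gram match with the reference.

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    """Minimal sentence-level BLEU over token lists (illustrative, not the official scorer)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
        overlap = sum((cand & ref).values())  # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        if overlap == 0:
            return 0.0
        precisions.append(overlap / total)
    # Brevity penalty discourages short candidates.
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

hyp = "Na początku było Słowo".split()
print(bleu(hyp, hyp))  # identical tokens score 1.0
```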


Training Data

The dataset used to train mrapacz/interlinear-pl-mt5-base-emb-concat-diacritics-ob is available on the Hugging Face Hub.