Model Card for Ancient Greek to English Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.
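To illustrate the interlinear output format the model targets, here is a minimal sketch of word-level alignment: each source token is paired with a gloss in the target language, preserving source word order. The Greek tokens and English glosses below are illustrative examples, not actual model output.

```python
# Illustrative interlinear alignment: each Ancient Greek token is paired
# with an English gloss, keeping the source word order intact.
source = ["Ἐν", "ἀρχῇ", "ἦν", "ὁ", "λόγος"]          # example source tokens
glosses = ["In", "beginning", "was", "the", "word"]   # example English glosses

def interlinear(src_tokens, tgt_glosses):
    """Pair each source token with its target gloss, position by position."""
    return list(zip(src_tokens, tgt_glosses))

# Print the two aligned rows, one column per word pair.
for grk, eng in interlinear(source, glosses):
    print(f"{grk}\t{eng}")
```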

Model Details

Model Description

  • Developed by: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: GreTa
  • Tokenizer: GreTa
  • Language(s): Ancient Greek (source) → English (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: OB (Oblubienica)
  • Text Preprocessing: Diacritics
  • Morphological Encoding: emb-concat

Model Performance

  • BLEU Score: 5.48
  • SemScore: 0.49
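For reference, BLEU measures n-gram overlap between a candidate translation and a reference. The sketch below is a simplified sentence-level BLEU (clipped n-gram precisions for n = 1..4, geometric mean, brevity penalty), not the exact sacrebleu configuration used to produce the score above, which the card does not specify.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped
    n-gram precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages candidates shorter than the reference.
    bp = 1.0 if len(candidate) >= len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0; any missing n-gram order drives the score toward 0 under this simplified formulation.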

Model Sources

  • Hugging Face repository: mrapacz/interlinear-en-greta-emb-concat-diacritics-ob


Dataset used to train mrapacz/interlinear-en-greta-emb-concat-diacritics-ob