Update README.md
README.md
CHANGED
```diff
@@ -477,6 +477,9 @@ Below are the evaluation results on the [Flores+200 devtest set](https://hugging
 
 </details>
 
+<details>
+<summary>English evaluation</summary>
+
 ### English
 
 This section presents the evaluation metrics for English translation tasks.
@@ -496,6 +499,11 @@ This section presents the evaluation metrics for English translation tasks.
 
 <img src="./images/bleu_en.png" alt="English" width="100%"/>
 
+</details>
+
+<details>
+<summary>Spanish evaluation</summary>
+
 ### Spanish
 
 This section presents the evaluation metrics for Spanish translation tasks.
@@ -515,8 +523,11 @@ This section presents the evaluation metrics for Spanish translation tasks.
 
 <img src="./images/bleu_es.png" alt="English" width="100%"/>
 
+</details>
 
-
+<details>
+<summary>Catalan evaluation</summary>
+
 ### Catalan
 
 This section presents the evaluation metrics for Catalan translation tasks.
@@ -536,6 +547,11 @@ This section presents the evaluation metrics for Catalan translation tasks.
 
 <img src="./images/bleu_ca.png" alt="English" width="100%"/>
 
+</details>
+
+<details>
+<summary>Galician evaluation</summary>
+
 ### Galician
 
 This section presents the evaluation metrics for Galician translation tasks.
@@ -554,7 +570,11 @@ This section presents the evaluation metrics for Galician translation tasks.
 
 <img src="./images/bleu_gl.png" alt="English" width="100%"/>
 
+</details>
 
+<details>
+<summary>Basque evaluation</summary>
+
 ### Basque
 
 This section presents the evaluation metrics for Basque translation tasks.
@@ -575,10 +595,15 @@ This section presents the evaluation metrics for Basque translation tasks.
 
 <img src="./images/bleu_eu.png" alt="English" width="100%"/>
 
+</details>
+
 ### Low-Resource Languages of Spain
 
 The tables below summarize the performance metrics for English, Spanish, and Catalan to Asturian, Aranese and Aragonese compared against [Transducens/IbRo-nllb](https://huggingface.co/Transducens/IbRo-nllb) [(Galiano Jimenez, et al.)](https://aclanthology.org/2024.wmt-1.85/), NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) and [SalamandraTA-2B](https://huggingface.co/BSC-LT/salamandraTA-2B).
 
+<details>
+<summary>English evaluation</summary>
+
 #### English-XX
 
 | | Source | Target | Bleu↑ | Ter↓ | ChrF↑ |
@@ -596,7 +621,12 @@ The tables below summarize the performance metrics for English, Spanish, and Catalan to Asturian, Aranese and Aragonese compared against [Transducens/IbRo-nllb](https://huggingface.co/Transducens/IbRo-nllb) [(Galiano Jimenez, et al.)](https://aclanthology.org/2024.wmt-1.85/), NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) and [SalamandraTA-2B](https://huggingface.co/BSC-LT/salamandraTA-2B).
 | SalamandraTA-7b-base | en | arg | 12.24 | 73.48 | 44.75 |
 | Transducens/IbRo-nllb | en | arg | 14.07 | 70.37 | 46.89 |
 
+</details>
+
 
+<details>
+<summary>Spanish evaluation</summary>
+
 #### Spanish-XX
 
 | | Source | Target | Bleu↑ | Ter↓ | ChrF↑ |
@@ -617,7 +647,13 @@ The tables below summarize the performance metrics for English, Spanish, and Catalan to Asturian, Aranese and Aragonese compared against [Transducens/IbRo-nllb](https://huggingface.co/Transducens/IbRo-nllb) [(Galiano Jimenez, et al.)](https://aclanthology.org/2024.wmt-1.85/), NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) and [SalamandraTA-2B](https://huggingface.co/BSC-LT/salamandraTA-2B).
 | SalamandraTA-7b-instruct | es | arg | 47.54 | 36.57 | 72.38 |
 | salamandraTA2B | es | arg | 44.57 | 37.93 | 71.32 |
 
+</details>
+
+
+<details>
+<summary>Catalan evaluation</summary>
 
+
 #### Catalan-XX
 
 
@@ -639,7 +675,7 @@ The tables below summarize the performance metrics for English, Spanish, and Catalan to Asturian, Aranese and Aragonese compared against [Transducens/IbRo-nllb](https://huggingface.co/Transducens/IbRo-nllb) [(Galiano Jimenez, et al.)](https://aclanthology.org/2024.wmt-1.85/), NLLB-3.3 ([Costa-jussà et al., 2022](https://arxiv.org/abs/2207.04672)) and [SalamandraTA-2B](https://huggingface.co/BSC-LT/salamandraTA-2B).
 | SalamandraTA-7b-instruct | ca | arg | 21.62 | 63.38 | 53.01 |
 | salamandraTA2B | ca | arg | 18.6 | 65.82 | 51.21 |
 
-
+</details>
 
 ## Ethical Considerations and Limitations
 
```
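The Bleu↑ / Ter↓ / ChrF↑ columns in the tables above are standard machine-translation metrics, in practice computed with a toolkit such as sacreBLEU. Purely as an illustration of what the ChrF column measures (not the repository's actual evaluation code), a minimal sentence-level chrF can be sketched in plain Python; the function name and the defaults (character n-gram order up to 6, β = 2) are assumptions matching the common chrF definition:

```python
from collections import Counter

def char_ngrams(text, n):
    """Count character n-grams; whitespace is ignored, as in default chrF."""
    text = text.replace(" ", "")
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Minimal sentence-level chrF: F-beta score over character n-gram
    precision and recall, averaged over n-gram orders 1..max_n.
    Returns a score in [0, 100]."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if sum(hyp.values()) == 0 or sum(ref.values()) == 0:
            continue  # sentence shorter than n characters
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0:
        return 0.0
    return 100 * (1 + beta**2) * p * r / (beta**2 * p + r)
```

A real evaluation would use sacreBLEU's `CHRF` (and `BLEU`/`TER`) implementations, which add tokenization options, word n-grams, and corpus-level aggregation that this sketch omits.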