Fine-tuned Whisper-small for Darija speech translation

This model is a fine-tuned version of openai/whisper-small on the Darija-C dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • BLEU: 0.7440
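
A minimal inference sketch, assuming the checkpoint is published under this card's repository id (Marialab/finetuned-whisper-small-1000-step) and that the input is 16 kHz Darija speech; `darija_clip.wav` is a hypothetical file name:

```python
# Hedged usage sketch: load the fine-tuned checkpoint and translate one clip.
import torch
import librosa
from transformers import WhisperProcessor, WhisperForConditionalGeneration

model_id = "Marialab/finetuned-whisper-small-1000-step"
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Whisper expects 16 kHz mono audio; librosa resamples on load.
audio, _ = librosa.load("darija_clip.wav", sr=16000)  # hypothetical file
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```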

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 1000
  • mixed_precision_training: Native AMP
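
For reference, the list above maps roughly onto the Seq2SeqTrainingArguments sketched below. This is a reconstruction from the hyperparameter list, not the original training script; output_dir is a placeholder, and dataset loading, the data collator, and the metric hook are omitted:

```python
# Sketch of the training configuration implied by the hyperparameter list.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./finetuned-whisper-small",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=1000,               # "training_steps" above
    fp16=True,                    # "Native AMP" mixed-precision training
    predict_with_generate=True,   # assumption: needed to score BLEU during eval
)
```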

Training results

| Training Loss | Epoch | Step | Validation Loss | BLEU |
|:---:|:---:|:---:|:---:|:---:|
| 4.1244 | 0.625 | 5 | 4.0913 | 0.0 |
| 4.1401 | 1.25 | 10 | 3.8806 | 0.0 |
| 3.5438 | 1.875 | 15 | 3.0904 | 0.0 |
| 2.7946 | 2.5 | 20 | 2.2453 | 0.0023 |
| 2.1793 | 3.125 | 25 | 1.7106 | 0.0083 |
| 1.6133 | 3.75 | 30 | 1.2200 | 0.0310 |
| 1.1125 | 4.375 | 35 | 0.8124 | 0.1554 |
| 0.8674 | 5.0 | 40 | 0.4519 | 0.4140 |
| 0.4645 | 5.625 | 45 | 0.2318 | 0.5646 |
| 0.2348 | 6.25 | 50 | 0.1173 | 0.6654 |
| 0.1596 | 6.875 | 55 | 0.0513 | 0.7341 |
| 0.0745 | 7.5 | 60 | 0.0323 | 0.7247 |
| 0.0447 | 8.125 | 65 | 0.0136 | 0.7440 |
| 0.014 | 8.75 | 70 | 0.0113 | 0.7284 |
| 0.0185 | 9.375 | 75 | 0.0107 | 0.7352 |
| 0.0638 | 10.0 | 80 | 0.0421 | 0.7070 |
| 0.0472 | 10.625 | 85 | 0.0503 | 0.6970 |
| 0.0681 | 11.25 | 90 | 0.0879 | 0.6954 |
| 0.1465 | 11.875 | 95 | 0.0407 | 0.6819 |
| 0.0483 | 12.5 | 100 | 0.0835 | 0.6678 |
| 0.1844 | 13.125 | 105 | 0.0661 | 0.6744 |
| 0.0737 | 13.75 | 110 | 0.1486 | 0.6494 |
| 0.1454 | 14.375 | 115 | 0.1018 | 0.6439 |
| 0.1203 | 15.0 | 120 | 0.0444 | 0.7143 |
| 0.0858 | 15.625 | 125 | 0.0148 | 0.7320 |
| 0.0463 | 16.25 | 130 | 0.0726 | 0.6406 |
| 0.1464 | 16.875 | 135 | 0.0586 | 0.6699 |
| 0.0938 | 17.5 | 140 | 0.0447 | 0.6639 |
| 0.1116 | 18.125 | 145 | 0.0737 | 0.6801 |
| 0.1031 | 18.75 | 150 | 0.0906 | 0.6794 |
| 0.1601 | 19.375 | 155 | 0.1172 | 0.6540 |
| 0.1957 | 20.0 | 160 | 0.0271 | 0.7095 |
| 0.0043 | 20.625 | 165 | 0.0491 | 0.6874 |
| 0.1013 | 21.25 | 170 | 0.0221 | 0.7341 |
| 0.0506 | 21.875 | 175 | 0.0313 | 0.6938 |
| 0.0545 | 22.5 | 180 | 0.0664 | 0.6533 |
| 0.1434 | 23.125 | 185 | 0.0586 | 0.6346 |
| 0.0891 | 23.75 | 190 | 0.0947 | 0.6823 |
| 0.1784 | 24.375 | 195 | 0.1534 | 0.6343 |
| 0.3143 | 25.0 | 200 | 0.1054 | 0.6431 |
| 0.182 | 25.625 | 205 | 0.0546 | 0.6610 |
| 0.0698 | 26.25 | 210 | 0.0816 | 0.6662 |
| 0.1513 | 26.875 | 215 | 0.0420 | 0.7162 |
| 0.0759 | 27.5 | 220 | 0.0995 | 0.6411 |
| 0.191 | 28.125 | 225 | 0.0334 | 0.7012 |
| 0.0429 | 28.75 | 230 | 0.0748 | 0.6273 |
| 0.1608 | 29.375 | 235 | 0.1665 | 0.5937 |
| 0.2917 | 30.0 | 240 | 0.1436 | 0.6353 |
| 0.2379 | 30.625 | 245 | 0.0348 | 0.6940 |
| 0.0835 | 31.25 | 250 | 0.0238 | 0.7153 |
| 0.0293 | 31.875 | 255 | 0.0581 | 0.6983 |
| 0.0946 | 32.5 | 260 | 0.0471 | 0.7104 |
| 0.1223 | 33.125 | 265 | 0.0660 | 0.7389 |
| 0.1151 | 33.75 | 270 | 0.0598 | 0.7160 |
| 0.1367 | 34.375 | 275 | 0.1139 | 0.6796 |
| 0.1004 | 35.0 | 280 | 0.0553 | 0.7200 |
| 0.0921 | 35.625 | 285 | 0.0396 | 0.6818 |
| 0.0523 | 36.25 | 290 | 0.0691 | 0.6757 |
| 0.0866 | 36.875 | 295 | 0.0505 | 0.7211 |
| 0.1391 | 37.5 | 300 | 0.0480 | 0.6985 |
| 0.0674 | 38.125 | 305 | 0.0701 | 0.6544 |
| 0.058 | 38.75 | 310 | 0.0546 | 0.7081 |
| 0.1008 | 39.375 | 315 | 0.0587 | 0.6832 |
| 0.0989 | 40.0 | 320 | 0.0435 | 0.6986 |
| 0.053 | 40.625 | 325 | 0.0094 | 0.7107 |
| 0.0164 | 41.25 | 330 | 0.0218 | 0.7248 |
| 0.0541 | 41.875 | 335 | 0.0036 | 0.7274 |
| 0.0086 | 42.5 | 340 | 0.0126 | 0.7213 |
| 0.0288 | 43.125 | 345 | 0.0004 | 0.7440 |
| 0.0006 | 43.75 | 350 | 0.0007 | 0.7440 |
| 0.0008 | 44.375 | 355 | 0.0201 | 0.7187 |
| 0.0334 | 45.0 | 360 | 0.0220 | 0.7380 |
| 0.0401 | 45.625 | 365 | 0.0002 | 0.7440 |
| 0.0003 | 46.25 | 370 | 0.0375 | 0.7178 |
| 0.0575 | 46.875 | 375 | 0.0009 | 0.7440 |
| 0.0011 | 47.5 | 380 | 0.0088 | 0.7250 |
| 0.1052 | 48.125 | 385 | 0.0353 | 0.7248 |
| 0.0138 | 48.75 | 390 | 0.0002 | 0.7440 |
| 0.0003 | 49.375 | 395 | 0.0003 | 0.7440 |
| 0.0007 | 50.0 | 400 | 0.0001 | 0.7440 |
| 0.0001 | 50.625 | 405 | 0.0037 | 0.7415 |
| 0.0001 | 51.25 | 410 | 0.0158 | 0.7415 |
| 0.0387 | 51.875 | 415 | 0.0001 | 0.7440 |
| 0.0001 | 52.5 | 420 | 0.0008 | 0.7440 |
| 0.0025 | 53.125 | 425 | 0.0001 | 0.7440 |
| 0.0001 | 53.75 | 430 | 0.0001 | 0.7440 |
| 0.0001 | 54.375 | 435 | 0.0001 | 0.7440 |
| 0.0001 | 55.0 | 440 | 0.0001 | 0.7440 |
| 0.0 | 55.625 | 445 | 0.0000 | 0.7440 |
| 0.0 | 56.25 | 450 | 0.0000 | 0.7440 |
| 0.0 | 56.875 | 455 | 0.0000 | 0.7440 |
| 0.0 | 57.5 | 460 | 0.0000 | 0.7440 |
| 0.0 | 58.125 | 465 | 0.0000 | 0.7440 |
| 0.0 | 58.75 | 470 | 0.0000 | 0.7440 |
| 0.0 | 59.375 | 475 | 0.0000 | 0.7440 |
| 0.0 | 60.0 | 480 | 0.0000 | 0.7440 |
| 0.0 | 60.625 | 485 | 0.0000 | 0.7440 |
| 0.0 | 61.25 | 490 | 0.0000 | 0.7440 |
| 0.0 | 61.875 | 495 | 0.0000 | 0.7440 |
| 0.0 | 62.5 | 500 | 0.0000 | 0.7440 |
| 0.0 | 63.125 | 505 | 0.0000 | 0.7440 |
| 0.0 | 63.75 | 510 | 0.0000 | 0.7440 |
| 0.0 | 64.375 | 515 | 0.0000 | 0.7440 |
| 0.0 | 65.0 | 520 | 0.0000 | 0.7440 |
| 0.0 | 65.625 | 525 | 0.0000 | 0.7440 |
| 0.0 | 66.25 | 530 | 0.0000 | 0.7440 |
| 0.0 | 66.875 | 535 | 0.0000 | 0.7440 |
| 0.0 | 67.5 | 540 | 0.0000 | 0.7440 |
| 0.0 | 68.125 | 545 | 0.0000 | 0.7440 |
| 0.0 | 68.75 | 550 | 0.0000 | 0.7440 |
| 0.0 | 69.375 | 555 | 0.0000 | 0.7440 |
| 0.0 | 70.0 | 560 | 0.0000 | 0.7440 |
| 0.0 | 70.625 | 565 | 0.0000 | 0.7440 |
| 0.0 | 71.25 | 570 | 0.0000 | 0.7440 |
| 0.0 | 71.875 | 575 | 0.0000 | 0.7440 |
| 0.0 | 72.5 | 580 | 0.0000 | 0.7440 |
| 0.0 | 73.125 | 585 | 0.0000 | 0.7440 |
| 0.0 | 73.75 | 590 | 0.0000 | 0.7440 |
| 0.0 | 74.375 | 595 | 0.0000 | 0.7440 |
| 0.0 | 75.0 | 600 | 0.0000 | 0.7440 |
| 0.0 | 75.625 | 605 | 0.0000 | 0.7440 |
| 0.0 | 76.25 | 610 | 0.0000 | 0.7440 |
| 0.0 | 76.875 | 615 | 0.0000 | 0.7440 |
| 0.0 | 77.5 | 620 | 0.0000 | 0.7440 |
| 0.0 | 78.125 | 625 | 0.0000 | 0.7440 |
| 0.0 | 78.75 | 630 | 0.0000 | 0.7440 |
| 0.0 | 79.375 | 635 | 0.0000 | 0.7440 |
| 0.0 | 80.0 | 640 | 0.0000 | 0.7440 |
| 0.0 | 80.625 | 645 | 0.0000 | 0.7440 |
| 0.0 | 81.25 | 650 | 0.0000 | 0.7440 |
| 0.0 | 81.875 | 655 | 0.0000 | 0.7440 |
| 0.0 | 82.5 | 660 | 0.0000 | 0.7440 |
| 0.0 | 83.125 | 665 | 0.0000 | 0.7440 |
| 0.0 | 83.75 | 670 | 0.0000 | 0.7440 |
| 0.0 | 84.375 | 675 | 0.0000 | 0.7440 |
| 0.0 | 85.0 | 680 | 0.0000 | 0.7440 |
| 0.0 | 85.625 | 685 | 0.0000 | 0.7440 |
| 0.0 | 86.25 | 690 | 0.0000 | 0.7440 |
| 0.0 | 86.875 | 695 | 0.0000 | 0.7440 |
| 0.0 | 87.5 | 700 | 0.0000 | 0.7440 |
| 0.0 | 88.125 | 705 | 0.0000 | 0.7440 |
| 0.0 | 88.75 | 710 | 0.0000 | 0.7440 |
| 0.0 | 89.375 | 715 | 0.0000 | 0.7440 |
| 0.0 | 90.0 | 720 | 0.0000 | 0.7440 |
| 0.0 | 90.625 | 725 | 0.0000 | 0.7440 |
| 0.0 | 91.25 | 730 | 0.0000 | 0.7440 |
| 0.0 | 91.875 | 735 | 0.0000 | 0.7440 |
| 0.0 | 92.5 | 740 | 0.0000 | 0.7440 |
| 0.0 | 93.125 | 745 | 0.0000 | 0.7440 |
| 0.0 | 93.75 | 750 | 0.0000 | 0.7440 |
| 0.0 | 94.375 | 755 | 0.0000 | 0.7440 |
| 0.0 | 95.0 | 760 | 0.0000 | 0.7440 |
| 0.0 | 95.625 | 765 | 0.0000 | 0.7440 |
| 0.0 | 96.25 | 770 | 0.0000 | 0.7440 |
| 0.0 | 96.875 | 775 | 0.0000 | 0.7440 |
| 0.0 | 97.5 | 780 | 0.0000 | 0.7440 |
| 0.0 | 98.125 | 785 | 0.0000 | 0.7440 |
| 0.0 | 98.75 | 790 | 0.0000 | 0.7440 |
| 0.0 | 99.375 | 795 | 0.0000 | 0.7440 |
| 0.0 | 100.0 | 800 | 0.0000 | 0.7440 |
| 0.0 | 100.625 | 805 | 0.0000 | 0.7440 |
| 0.0 | 101.25 | 810 | 0.0000 | 0.7440 |
| 0.0 | 101.875 | 815 | 0.0000 | 0.7440 |
| 0.0 | 102.5 | 820 | 0.0000 | 0.7440 |
| 0.0 | 103.125 | 825 | 0.0000 | 0.7440 |
| 0.0 | 103.75 | 830 | 0.0000 | 0.7440 |
| 0.0 | 104.375 | 835 | 0.0000 | 0.7440 |
| 0.0 | 105.0 | 840 | 0.0000 | 0.7440 |
| 0.0 | 105.625 | 845 | 0.0000 | 0.7440 |
| 0.0 | 106.25 | 850 | 0.0000 | 0.7440 |
| 0.0 | 106.875 | 855 | 0.0000 | 0.7440 |
| 0.0 | 107.5 | 860 | 0.0000 | 0.7440 |
| 0.0 | 108.125 | 865 | 0.0000 | 0.7440 |
| 0.0 | 108.75 | 870 | 0.0000 | 0.7440 |
| 0.0 | 109.375 | 875 | 0.0000 | 0.7440 |
| 0.0 | 110.0 | 880 | 0.0000 | 0.7440 |
| 0.0 | 110.625 | 885 | 0.0000 | 0.7440 |
| 0.0 | 111.25 | 890 | 0.0000 | 0.7440 |
| 0.0 | 111.875 | 895 | 0.0000 | 0.7440 |
| 0.0 | 112.5 | 900 | 0.0000 | 0.7440 |
| 0.0 | 113.125 | 905 | 0.0000 | 0.7440 |
| 0.0 | 113.75 | 910 | 0.0000 | 0.7440 |
| 0.0 | 114.375 | 915 | 0.0000 | 0.7440 |
| 0.0 | 115.0 | 920 | 0.0000 | 0.7440 |
| 0.0 | 115.625 | 925 | 0.0000 | 0.7440 |
| 0.0 | 116.25 | 930 | 0.0000 | 0.7440 |
| 0.0 | 116.875 | 935 | 0.0000 | 0.7440 |
| 0.0 | 117.5 | 940 | 0.0000 | 0.7440 |
| 0.0 | 118.125 | 945 | 0.0000 | 0.7440 |
| 0.0 | 118.75 | 950 | 0.0000 | 0.7440 |
| 0.0 | 119.375 | 955 | 0.0000 | 0.7440 |
| 0.0 | 120.0 | 960 | 0.0000 | 0.7440 |
| 0.0 | 120.625 | 965 | 0.0000 | 0.7440 |
| 0.0 | 121.25 | 970 | 0.0000 | 0.7440 |
| 0.0 | 121.875 | 975 | 0.0000 | 0.7440 |
| 0.0 | 122.5 | 980 | 0.0000 | 0.7440 |
| 0.0 | 123.125 | 985 | 0.0000 | 0.7440 |
| 0.0 | 123.75 | 990 | 0.0000 | 0.7440 |
| 0.0 | 124.375 | 995 | 0.0000 | 0.7440 |
| 0.0 | 125.0 | 1000 | 0.0000 | 0.7440 |
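
The BLEU values above are on a 0-1 scale, consistent with the evaluate library's "bleu" metric. A hypothetical compute_metrics hook that would produce numbers like these (the actual evaluation code is not included in this card):

```python
# Assumed metric hook for Seq2SeqTrainer; decodes predictions and labels,
# then scores them with BLEU. This is a sketch, not the card author's script.
import evaluate
from transformers import WhisperProcessor

processor = WhisperProcessor.from_pretrained("openai/whisper-small")
bleu = evaluate.load("bleu")  # reports BLEU on a 0-1 scale

def compute_metrics(eval_preds):
    pred_ids = eval_preds.predictions
    label_ids = eval_preds.label_ids
    label_ids[label_ids == -100] = processor.tokenizer.pad_token_id  # undo label masking
    preds = processor.batch_decode(pred_ids, skip_special_tokens=True)
    refs = processor.batch_decode(label_ids, skip_special_tokens=True)
    result = bleu.compute(predictions=preds, references=[[r] for r in refs])
    return {"bleu": result["bleu"]}
```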

Framework versions

  • Transformers 4.47.0
  • PyTorch 2.5.1+cu121
  • Datasets 2.19.2
  • Tokenizers 0.21.0