---
library_name: transformers
language:
  - jv
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - whisper
  - javanese
  - asr
  - generated_from_trainer
datasets:
  - jv_id_asr_split
metrics:
  - wer
model-index:
  - name: Whisper-Tiny-Java-v3
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: jv_id_asr_split
          type: jv_id_asr_split
          config: jv_id_asr_source
          split: None
          args: jv_id_asr_source
        metrics:
          - name: Wer
            type: wer
            value: 0.2586507557925852
---

# Whisper-Tiny-Java-v3

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the jv_id_asr_split dataset. It achieves the following results on the evaluation set:

- Loss: 0.2980
- Wer: 0.2587
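
The sketch below shows one minimal way to transcribe Javanese audio with this checkpoint via the `transformers` pipeline API. The model path and audio filename are placeholders, and forcing the language via `generate_kwargs` is an assumption (Whisper otherwise auto-detects it):

```python
from transformers import pipeline

# Assumption: "./Whisper-Tiny-Java-v3" is a local checkpoint directory or a
# Hub repo id for this model; replace with the actual path.
asr = pipeline("automatic-speech-recognition", model="./Whisper-Tiny-Java-v3")

# Transcribe a Javanese audio file (hypothetical filename).
result = asr(
    "sample_jv.wav",
    generate_kwargs={"language": "javanese", "task": "transcribe"},
)
print(result["text"])
```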

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
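
As a non-authoritative sketch while this section is unfilled: the dataset named in the metadata could be loaded with the `datasets` library as below, assuming `jv_id_asr_split` / `jv_id_asr_source` resolve to a Hub dataset id and config (adjust the namespace/path to wherever the dataset is actually hosted):

```python
from datasets import load_dataset

# Assumption: these ids match a dataset available on the Hub.
ds = load_dataset("jv_id_asr_split", "jv_id_asr_source")
print(ds)  # inspect available splits and columns
```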

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the equivalent `Seq2SeqTrainingArguments` follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10000
- mixed_precision_training: Native AMP
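
These settings roughly correspond to a `Seq2SeqTrainingArguments` configuration along the following lines. `output_dir` is a placeholder, and the AdamW betas/epsilon shown in the comment are the `transformers` defaults rather than values confirmed by the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Whisper-Tiny-Java-v3",  # assumption: output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="linear",
    warmup_ratio=0.1,               # 1,000 of the 10,000 steps
    max_steps=10_000,
    fp16=True,                      # "Native AMP" mixed precision
)
```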

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 1.1788        | 0.0540 | 500   | 0.9671          | 0.6590 |
| 0.8015        | 0.1081 | 1000  | 0.6977          | 0.5305 |
| 0.6498        | 0.1621 | 1500  | 0.5725          | 0.6670 |
| 0.5828        | 0.2161 | 2000  | 0.5094          | 0.4829 |
| 0.5226        | 0.2702 | 2500  | 0.4642          | 0.3860 |
| 0.4955        | 0.3242 | 3000  | 0.4341          | 0.3915 |
| 0.4616        | 0.3782 | 3500  | 0.4128          | 0.3540 |
| 0.4474        | 0.4323 | 4000  | 0.3900          | 0.3614 |
| 0.4387        | 0.4863 | 4500  | 0.3736          | 0.3563 |
| 0.4154        | 0.5403 | 5000  | 0.3606          | 0.3274 |
| 0.419         | 0.5944 | 5500  | 0.3495          | 0.3144 |
| 0.3799        | 0.6484 | 6000  | 0.3398          | 0.2922 |
| 0.3802        | 0.7024 | 6500  | 0.3290          | 0.3044 |
| 0.3611        | 0.7565 | 7000  | 0.3225          | 0.2823 |
| 0.3548        | 0.8105 | 7500  | 0.3168          | 0.2733 |
| 0.346         | 0.8645 | 8000  | 0.3105          | 0.2660 |
| 0.3547        | 0.9186 | 8500  | 0.3063          | 0.2708 |
| 0.3211        | 0.9726 | 9000  | 0.3019          | 0.2827 |
| 0.2718        | 1.0267 | 9500  | 0.2990          | 0.2660 |
| 0.2859        | 1.0807 | 10000 | 0.2980          | 0.2587 |
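
The Wer column is the word error rate (substitutions + deletions + insertions divided by reference word count), so the final 0.2587 means roughly 26% of words differ from the reference. A minimal sketch of computing it with the `evaluate` library follows; the transcripts are illustrative, not taken from the eval set:

```python
import evaluate

# Load the word-error-rate metric used in this card.
wer_metric = evaluate.load("wer")

predictions = ["aku arep mangan"]        # hypothetical model transcript
references = ["aku arep mangan sega"]    # hypothetical reference transcript

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # one deletion over four reference words -> 0.2500
```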

### Framework versions

- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu126
- Datasets 3.4.0
- Tokenizers 0.21.1