# miosipof/whisper-small-ft-balbus-sep28k-v1.5
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Apple SEP-28k dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):
- Loss: 0.1218
- Accuracy: 0.8121
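The card does not state the downstream task explicitly, but the accuracy metric and the SEP-28k name suggest an audio-classification fine-tune (e.g. stuttering-event detection) rather than transcription. A minimal inference sketch under that assumption, where `clip.wav` is a placeholder path to a local 16 kHz mono recording:

```python
from transformers import pipeline

# Assumption: the checkpoint was fine-tuned with an audio-classification head,
# so the generic audio-classification pipeline can load it directly.
classifier = pipeline(
    "audio-classification",
    model="miosipof/whisper-small-ft-balbus-sep28k-v1.5",
)

# "clip.wav" is a hypothetical local file; Whisper models expect 16 kHz audio.
predictions = classifier("clip.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```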
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-06
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- training_steps: 2000
- mixed_precision_training: Native AMP
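For reference, here is a sketch of how the values above map onto 🤗 `transformers.TrainingArguments`. The actual training script, model head, and data preprocessing are not shown in this card, so this is an illustrative reconstruction rather than the original configuration; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Illustrative mapping of the listed hyperparameters; the Adam betas and epsilon
# above match the TrainingArguments defaults (0.9, 0.999, 1e-8).
training_args = TrainingArguments(
    output_dir="whisper-small-ft-balbus-sep28k-v1.5",  # placeholder
    learning_rate=3e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,   # 32 * 2 = 64 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.5,
    max_steps=2000,
    fp16=True,                       # "Native AMP" mixed precision
)
```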
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.173 | 0.1253 | 50 | 0.1725 | 0.5682 |
0.1729 | 0.2506 | 100 | 0.1715 | 0.5655 |
0.1707 | 0.3759 | 150 | 0.1701 | 0.5680 |
0.1692 | 0.5013 | 200 | 0.1684 | 0.5708 |
0.1661 | 0.6266 | 250 | 0.1666 | 0.5762 |
0.1642 | 0.7519 | 300 | 0.1613 | 0.6346 |
0.158 | 0.8772 | 350 | 0.1572 | 0.6543 |
0.1522 | 1.0025 | 400 | 0.1475 | 0.6963 |
0.1402 | 1.1278 | 450 | 0.1324 | 0.7455 |
0.1258 | 1.2531 | 500 | 0.1230 | 0.7733 |
0.1222 | 1.3784 | 550 | 0.1157 | 0.7918 |
0.1091 | 1.5038 | 600 | 0.1112 | 0.8008 |
0.1123 | 1.6291 | 650 | 0.1089 | 0.8070 |
0.1107 | 1.7544 | 700 | 0.1177 | 0.7860 |
0.1137 | 1.8797 | 750 | 0.1086 | 0.8062 |
0.1061 | 2.0050 | 800 | 0.1063 | 0.8126 |
0.0981 | 2.1303 | 850 | 0.1071 | 0.8140 |
0.0957 | 2.2556 | 900 | 0.1097 | 0.8099 |
0.1006 | 2.3810 | 950 | 0.1055 | 0.8134 |
0.0974 | 2.5063 | 1000 | 0.1123 | 0.8092 |
0.0965 | 2.6316 | 1050 | 0.1078 | 0.8128 |
0.1 | 2.7569 | 1100 | 0.1109 | 0.8030 |
0.0985 | 2.8822 | 1150 | 0.1075 | 0.8098 |
0.1006 | 3.0075 | 1200 | 0.1058 | 0.8167 |
0.0832 | 3.1328 | 1250 | 0.1097 | 0.8151 |
0.0841 | 3.2581 | 1300 | 0.1097 | 0.8113 |
0.08 | 3.3835 | 1350 | 0.1104 | 0.8112 |
0.0843 | 3.5088 | 1400 | 0.1097 | 0.8139 |
0.0816 | 3.6341 | 1450 | 0.1125 | 0.8135 |
0.083 | 3.7594 | 1500 | 0.1097 | 0.8135 |
0.0854 | 3.8847 | 1550 | 0.1112 | 0.8170 |
0.0848 | 4.0100 | 1600 | 0.1083 | 0.8118 |
0.072 | 4.1353 | 1650 | 0.1161 | 0.8112 |
0.0734 | 4.2607 | 1700 | 0.1171 | 0.8126 |
0.0689 | 4.3860 | 1750 | 0.1208 | 0.8149 |
0.0682 | 4.5113 | 1800 | 0.1208 | 0.8118 |
0.0686 | 4.6366 | 1850 | 0.1215 | 0.8115 |
0.0698 | 4.7619 | 1900 | 0.1208 | 0.8120 |
0.0669 | 4.8872 | 1950 | 0.1219 | 0.8118 |
0.0698 | 5.0125 | 2000 | 0.1218 | 0.8121 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.2.0
- Datasets 3.2.0
- Tokenizers 0.20.3