jun-han committed
Commit cae4257 · verified · 1 Parent(s): 6bc34db

End of training

Files changed (2)
  1. README.md +8 -7
  2. generation_config.json +1 -1
README.md CHANGED
@@ -19,8 +19,8 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 16.1 dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.2113
- - Cer: 10.2912
+ - Loss: 0.2284
+ - Cer: 10.0577

  ## Model description

@@ -46,21 +46,22 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - training_steps: 3000
+ - training_steps: 4000
  - mixed_precision_training: Native AMP

  ### Training results

  | Training Loss | Epoch  | Step | Validation Loss | Cer     |
  |:-------------:|:------:|:----:|:---------------:|:-------:|
- | 0.1043        | 1.3245 | 1000 | 0.2119          | 10.7080 |
- | 0.0308        | 2.6490 | 2000 | 0.2085          | 10.6226 |
- | 0.0085        | 3.9735 | 3000 | 0.2113          | 10.2912 |
+ | 0.1053        | 1.3245 | 1000 | 0.2142          | 10.8687 |
+ | 0.0322        | 2.6490 | 2000 | 0.2128          | 10.3917 |
+ | 0.0086        | 3.9735 | 3000 | 0.2213          | 10.1155 |
+ | 0.0023        | 5.2980 | 4000 | 0.2284          | 10.0577 |


  ### Framework versions

- - Transformers 4.40.0
+ - Transformers 4.40.1
  - Pytorch 2.1.2
  - Datasets 2.19.0
  - Tokenizers 0.19.1
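The Cer column above is a character error rate, reported as a percentage. As background, a minimal sketch of how CER is conventionally computed (character-level Levenshtein edit distance divided by the reference length); this helper is illustrative only, not the metric code used for this run:

```python
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance over characters.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    # CER = character edits needed / reference length, here as a percentage.
    return 100.0 * levenshtein(reference, hypothesis) / len(reference)

print(cer("hello world", "helo world"))  # one deleted character out of 11
```

A Cer of 10.0577 therefore means roughly one character edit per ten reference characters on the evaluation set.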
generation_config.json CHANGED
@@ -252,5 +252,5 @@
      "transcribe": 50359,
      "translate": 50358
    },
-   "transformers_version": "4.40.0"
+   "transformers_version": "4.40.1"
  }
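The README diff above lists a linear scheduler with 500 warmup steps over 4000 training steps. A minimal sketch of that schedule's shape, mirroring transformers' `get_linear_schedule_with_warmup` (the base learning rate of 1e-5 is an assumed placeholder, since the diff does not show it):

```python
def linear_warmup_lr(step, base_lr=1e-5, warmup_steps=500, total_steps=4000):
    # Linear warmup from 0 to base_lr over warmup_steps,
    # then linear decay back to 0 at total_steps.
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(250))   # halfway through warmup: 5e-06
print(linear_warmup_lr(500))   # peak learning rate: 1e-05
print(linear_warmup_lr(4000))  # end of training: 0.0
```

Note that the table's checkpoint at step 4000 (epoch 5.2980) is evaluated at the very end of this decay, where the learning rate has reached zero.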