Whisper-genshin-en-2-vocab
This model is a fine-tuned version of openai/whisper-small on the genshin-en-vocab dataset. It achieves the following results on the evaluation set:
- Loss: 1.0410
- WER: 0.8868
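As a quick start, the snippet below is a minimal sketch of transcribing an audio file with this checkpoint via the transformers ASR pipeline; the repository id shaunliu82714/whisper-finetuned-vocab-trained and the audio path are assumptions for illustration.

```python
# Minimal sketch: load the fine-tuned checkpoint and transcribe one audio file.
# The repository id and the audio path are illustrative assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="shaunliu82714/whisper-finetuned-vocab-trained",  # assumed repo id
)

# Placeholder path; the pipeline decodes and resamples the file to the
# 16 kHz rate Whisper expects.
result = asr("audio.wav")
print(result["text"])
```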
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a code sketch mapping them onto training arguments follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 5
- mixed_precision_training: Native AMP
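For orientation, here is a hedged sketch of how these settings might map onto Seq2SeqTrainingArguments in transformers; only the values listed above come from this card, while the output directory is a placeholder and the model/dataset wiring is omitted.

```python
# Sketch only: the hyperparameters above expressed as Seq2SeqTrainingArguments.
# output_dir is a placeholder; model and dataset setup are not shown here.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-genshin-en-2-vocab",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,          # Adam betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=5,
    fp16=True,               # "Native AMP" mixed-precision training
)
```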
Training results
Training Loss | Epoch | Step | Validation Loss | WER |
---|---|---|---|---|
15.0132 | 0.0847 | 10 | 14.9341 | 1.1887 |
13.4253 | 0.1695 | 20 | 10.7182 | 1.1887 |
8.5291 | 0.2542 | 30 | 5.9579 | 1.1698 |
5.457 | 0.3390 | 40 | 3.9144 | 1.1132 |
3.0251 | 0.4237 | 50 | 1.4680 | 1.0566 |
1.4178 | 0.5085 | 60 | 1.2124 | 0.9623 |
1.4055 | 0.5932 | 70 | 1.1406 | 0.9811 |
1.0228 | 0.6780 | 80 | 1.0921 | 0.9623 |
1.0819 | 0.7627 | 90 | 1.0417 | 0.9623 |
0.8353 | 0.8475 | 100 | 1.0185 | 0.9434 |
0.8447 | 0.9322 | 110 | 1.0180 | 0.9245 |
0.7283 | 1.0169 | 120 | 1.0234 | 0.8868 |
0.4207 | 1.1017 | 130 | 1.0425 | 0.9245 |
0.4697 | 1.1864 | 140 | 1.0700 | 0.9434 |
0.366 | 1.2712 | 150 | 1.0767 | 0.9245 |
0.5056 | 1.3559 | 160 | 1.0976 | 0.9623 |
0.3706 | 1.4407 | 170 | 1.0599 | 0.9811 |
0.3012 | 1.5254 | 180 | 1.0518 | 0.9245 |
0.2972 | 1.6102 | 190 | 1.0651 | 0.9245 |
0.2596 | 1.6949 | 200 | 1.0465 | 0.9057 |
0.2067 | 1.7797 | 210 | 0.9912 | 0.9057 |
0.3327 | 1.8644 | 220 | 1.0045 | 0.9057 |
0.2363 | 1.9492 | 230 | 1.0041 | 0.9057 |
0.1537 | 2.0339 | 240 | 1.0085 | 0.9057 |
0.0523 | 2.1186 | 250 | 1.0026 | 0.9245 |
0.0898 | 2.2034 | 260 | 0.9883 | 0.9245 |
0.059 | 2.2881 | 270 | 0.9740 | 0.9245 |
0.062 | 2.3729 | 280 | 0.9758 | 0.9057 |
0.0719 | 2.4576 | 290 | 1.0023 | 0.9057 |
0.0241 | 2.5424 | 300 | 1.0046 | 0.9245 |
0.0401 | 2.6271 | 310 | 1.0177 | 0.9057 |
0.0395 | 2.7119 | 320 | 1.0301 | 0.9057 |
0.052 | 2.7966 | 330 | 1.0400 | 0.9057 |
0.063 | 2.8814 | 340 | 1.0566 | 0.9245 |
0.0244 | 2.9661 | 350 | 1.0358 | 0.9057 |
0.0208 | 3.0508 | 360 | 1.0179 | 0.9245 |
0.0073 | 3.1356 | 370 | 1.0228 | 0.9245 |
0.0059 | 3.2203 | 380 | 1.0336 | 0.9245 |
0.0065 | 3.3051 | 390 | 1.0231 | 0.9245 |
0.0075 | 3.3898 | 400 | 1.0128 | 0.8679 |
0.0046 | 3.4746 | 410 | 1.0114 | 0.8679 |
0.0073 | 3.5593 | 420 | 1.0139 | 0.8679 |
0.0106 | 3.6441 | 430 | 1.0262 | 0.8679 |
0.0053 | 3.7288 | 440 | 1.0332 | 0.8679 |
0.0058 | 3.8136 | 450 | 1.0446 | 0.8679 |
0.0035 | 3.8983 | 460 | 1.0514 | 0.8679 |
0.0052 | 3.9831 | 470 | 1.0565 | 0.8868 |
0.002 | 4.0678 | 480 | 1.0525 | 0.8868 |
0.002 | 4.1525 | 490 | 1.0477 | 0.8868 |
0.0021 | 4.2373 | 500 | 1.0458 | 0.8868 |
0.0021 | 4.3220 | 510 | 1.0447 | 0.8868 |
0.002 | 4.4068 | 520 | 1.0446 | 0.8868 |
0.0021 | 4.4915 | 530 | 1.0434 | 0.8868 |
0.0019 | 4.5763 | 540 | 1.0434 | 0.8868 |
0.0018 | 4.6610 | 550 | 1.0425 | 0.8868 |
0.002 | 4.7458 | 560 | 1.0425 | 0.8868 |
0.0021 | 4.8305 | 570 | 1.0419 | 0.8868 |
0.0022 | 4.9153 | 580 | 1.0414 | 0.8868 |
0.0023 | 5.0 | 590 | 1.0410 | 0.8868 |
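WER in the table above is the word error rate on the evaluation split (lower is better). Below is a minimal sketch of how such a score can be computed with the evaluate library; the prediction and reference strings are made up for illustration.

```python
# Sketch: computing word error rate (WER) with the `evaluate` library.
# The strings below are illustrative and not taken from the evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the traveler arrived in mondstadt"]
references = ["the traveler arrived in mondstadt today"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # ratio of word-level errors to reference words
```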
Framework versions
- Transformers 4.45.2
- PyTorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1
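A small sketch for checking that a local environment matches the versions listed above; the version strings in the comments are taken from this list.

```python
# Sketch: print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card lists 4.45.2
print("PyTorch:", torch.__version__)              # card lists 2.4.1+cu121
print("Datasets:", datasets.__version__)          # card lists 3.0.1
print("Tokenizers:", tokenizers.__version__)      # card lists 0.20.1
```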