---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---
# Whisper Large V2
This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2535
- Wer: 8.9988
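Given the `nl` language tag, the checkpoint should be usable for Dutch transcription through the standard `transformers` ASR pipeline. A minimal sketch, using a hypothetical repo id as a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the repo id below is a hypothetical
# placeholder, not the actual location of this model.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",
)

# Pin language and task so Whisper does not auto-detect them.
result = asr(
    "sample.wav",
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```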
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the code sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
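As a rough sketch, the hyperparameters above map onto `Seq2SeqTrainingArguments` as follows; the output directory is a placeholder, and anything not listed above is an assumption:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-nl",  # placeholder, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=12,      # assumes single-GPU training
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                      # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
)
```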
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 0.6185 | 0.09 | 30 | 0.3181 | 12.1555 |
| 0.3243 | 0.19 | 60 | 0.2801 | 11.9994 |
| 0.3044 | 0.28 | 90 | 0.2689 | 11.9876 |
| 0.2843 | 0.38 | 120 | 0.2574 | 10.4270 |
| 0.2859 | 0.47 | 150 | 0.2427 | 12.1879 |
| 0.271 | 0.57 | 180 | 0.2374 | 14.0459 |
| 0.2584 | 0.66 | 210 | 0.2319 | 11.1690 |
| 0.2916 | 0.76 | 240 | 0.2302 | 13.6013 |
| 0.2781 | 0.85 | 270 | 0.2224 | 10.6832 |
| 0.2498 | 0.95 | 300 | 0.2244 | 10.2945 |
| 0.2033 | 1.04 | 330 | 0.2311 | 11.3045 |
| 0.1323 | 1.14 | 360 | 0.2268 | 10.9393 |
| 0.1322 | 1.23 | 390 | 0.2242 | 9.9912 |
| 0.1312 | 1.33 | 420 | 0.2267 | 14.3993 |
| 0.1392 | 1.42 | 450 | 0.2209 | 9.9352 |
| 0.1437 | 1.52 | 480 | 0.2146 | 10.0824 |
| 0.1299 | 1.61 | 510 | 0.2198 | 16.3516 |
| 0.1328 | 1.71 | 540 | 0.2161 | 10.0118 |
| 0.1425 | 1.8 | 570 | 0.2133 | 11.3280 |
| 0.1332 | 1.9 | 600 | 0.2137 | 10.4476 |
| 0.1354 | 1.99 | 630 | 0.2101 | 10.0324 |
| 0.0601 | 2.09 | 660 | 0.2241 | 9.2285 |
| 0.0557 | 2.18 | 690 | 0.2235 | 9.0548 |
| 0.0567 | 2.28 | 720 | 0.2239 | 9.5259 |
| 0.0583 | 2.37 | 750 | 0.2246 | 11.3575 |
| 0.0642 | 2.47 | 780 | 0.2241 | 9.7556 |
| 0.059 | 2.56 | 810 | 0.2256 | 10.1266 |
| 0.0596 | 2.66 | 840 | 0.2228 | 9.5318 |
| 0.0571 | 2.75 | 870 | 0.2206 | 12.1290 |
| 0.0581 | 2.85 | 900 | 0.2222 | 10.4240 |
| 0.063 | 2.94 | 930 | 0.2229 | 9.3551 |
| 0.0428 | 3.04 | 960 | 0.2313 | 9.8557 |
| 0.0237 | 3.13 | 990 | 0.2337 | 9.7261 |
| 0.0228 | 3.23 | 1020 | 0.2380 | 9.3433 |
| 0.022 | 3.32 | 1050 | 0.2403 | 9.6849 |
| 0.0235 | 3.42 | 1080 | 0.2342 | 9.5878 |
| 0.0206 | 3.51 | 1110 | 0.2341 | 9.0371 |
| 0.0205 | 3.61 | 1140 | 0.2391 | 9.2668 |
| 0.022 | 3.7 | 1170 | 0.2336 | 9.6496 |
| 0.0201 | 3.8 | 1200 | 0.2363 | 9.4876 |
| 0.0213 | 3.89 | 1230 | 0.2303 | 9.5819 |
| 0.0206 | 3.99 | 1260 | 0.2348 | 9.4670 |
| 0.0098 | 4.08 | 1290 | 0.2450 | 9.4729 |
| 0.0088 | 4.18 | 1320 | 0.2497 | 9.1461 |
| 0.0076 | 4.27 | 1350 | 0.2497 | 9.2815 |
| 0.0086 | 4.37 | 1380 | 0.2509 | 9.0901 |
| 0.0064 | 4.46 | 1410 | 0.2524 | 8.9164 |
| 0.0075 | 4.56 | 1440 | 0.2539 | 8.9340 |
| 0.0069 | 4.65 | 1470 | 0.2532 | 8.9870 |
| 0.0083 | 4.75 | 1500 | 0.2529 | 9.0135 |
| 0.0064 | 4.84 | 1530 | 0.2536 | 8.9605 |
| 0.0065 | 4.94 | 1560 | 0.2535 | 8.9988 |
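The Wer column is conventionally a percentage in these generated cards. A minimal sketch of the metric computation with the `evaluate` library (the transcripts below are made up for illustration):

```python
import evaluate

# Word error rate, as used for the "Wer" column above.
wer_metric = evaluate.load("wer")

# Hypothetical predictions/references purely for illustration.
predictions = ["de kat zat op de mat", "hallo wereld"]
references = ["de kat zat op de mat", "hallo wereld allemaal"]

# evaluate returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```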
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0