
Whisper Large V2

This model is a fine-tuned version of openai/whisper-large-v2 on an unspecified dataset (the dataset name was not recorded in this card). It achieves the following results on the evaluation set; a hedged loading example follows the results:

  • Loss: 0.2995
  • WER: 10.8891
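
The card does not include a usage example. The sketch below shows one plausible way to load a checkpoint like this for transcription with the transformers pipeline API; the repository id is a placeholder, since the published model id is not stated here.

```python
# Minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub.
# "your-username/whisper-large-v2-finetuned" is a placeholder repo id, not the actual one.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-finetuned",  # placeholder; replace with the real repo id
    chunk_length_s=30,  # Whisper processes audio in 30-second windows
)

# Transcribe a local audio file (any format ffmpeg can decode).
result = asr("sample.wav")
print(result["text"])
```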

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding training arguments follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 5
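
The training script itself is not part of this card. As a rough illustration, the hyperparameters above would correspond to a transformers Seq2SeqTrainingArguments configuration roughly like the sketch below; the output directory name is a placeholder.

```python
# Hedged sketch of how the listed hyperparameters map onto Seq2SeqTrainingArguments.
# The actual training script is not included in this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-finetuned",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    # Adam betas/epsilon match the card; these are also the library defaults.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```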

Training results

Training Loss  Epoch  Step  Validation Loss  WER
0.5753 0.13 30 0.3392 14.8670
0.2975 0.25 60 0.3042 19.4403
0.2893 0.38 90 0.3097 38.1677
0.2771 0.51 120 0.2772 13.8256
0.2656 0.63 150 0.2709 15.7969
0.2518 0.76 180 0.2602 14.6440
0.2419 0.89 210 0.2567 16.0404
0.2391 1.01 240 0.2599 13.9781
0.1319 1.14 270 0.2546 13.2594
0.128 1.27 300 0.2591 18.8448
0.1299 1.39 330 0.2599 11.7135
0.1261 1.52 360 0.2587 13.1303
0.1329 1.65 390 0.2541 12.4850
0.1303 1.77 420 0.2501 11.9980
0.115 1.9 450 0.2506 13.2799
0.1189 2.03 480 0.2486 11.0270
0.0577 2.15 510 0.2554 12.2532
0.0566 2.28 540 0.2587 11.3702
0.0573 2.41 570 0.2651 10.9390
0.0533 2.53 600 0.2590 11.1473
0.0519 2.66 630 0.2636 10.8363
0.056 2.78 660 0.2577 11.3732
0.062 2.91 690 0.2537 11.9833
0.0447 3.04 720 0.2675 11.7017
0.0232 3.16 750 0.2757 11.7927
0.0237 3.29 780 0.2751 12.5378
0.0216 3.42 810 0.2791 12.0244
0.0195 3.54 840 0.2787 10.9390
0.022 3.67 870 0.2749 12.4967
0.0213 3.8 900 0.2757 11.9012
0.0201 3.92 930 0.2735 11.5052
0.0146 4.05 960 0.2854 11.3673
0.0096 4.18 990 0.2956 11.7399
0.008 4.3 1020 0.2955 11.8073
0.008 4.43 1050 0.2963 11.8044
0.0087 4.56 1080 0.2971 11.6519
0.0076 4.68 1110 0.2971 11.7017
0.0073 4.81 1140 0.2982 10.8246
0.0068 4.94 1170 0.2995 10.8891
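
The card does not say how WER was computed. A common setup for Whisper fine-tuning, assumed in the sketch below, applies the evaluate library's word error rate metric to decoded predictions and reference transcripts, scaled by 100 to match the percentage-style values reported above.

```python
# Assumed WER computation; the actual evaluation code is not part of this card.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the quick brown fox", "hello world"]   # decoded model outputs (example strings)
references  = ["the quick brown fox", "hello word"]    # ground-truth transcripts (example strings)

# `compute` returns a fraction; multiply by 100 to report WER as a percentage.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```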

Framework versions

  • Transformers 4.38.0.dev0
  • PyTorch 2.1.0+cu121
  • Datasets 2.14.6
  • Tokenizers 0.15.0

Model size

  • 1.54B parameters (F32, Safetensors)