
Whisper Base NL

This model (GerwinVanGiessen/whisper-base-nl-1) is a fine-tuned version of openai/whisper-base on the Common Voice 17.0 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.343928
  • WER: 19.003155
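
The checkpoint can be loaded with the standard transformers automatic-speech-recognition pipeline. The snippet below is a minimal sketch: the repository id is the one shown on this page, the audio file name is a placeholder, and forcing the decoding language to Dutch is an assumption based on the model's name.

```python
# Minimal usage sketch (audio path is a placeholder; expects a 16 kHz file).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="GerwinVanGiessen/whisper-base-nl-1",
)

# Whisper is multilingual; pinning language/task keeps the output as a Dutch transcription.
result = asr(
    "sample_nl.wav",
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```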

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 7500
  • mixed_precision_training: Native AMP
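
For readers reconstructing the setup, the list above maps roughly onto Seq2SeqTrainingArguments as sketched below. This is not the original training script: output_dir, the evaluation settings, and any arguments not listed above are assumptions.

```python
# Sketch of how the listed hyperparameters map onto Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-base-nl",   # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    seed=42,
    optim="adamw_torch",              # Adam with betas=(0.9, 0.999), eps=1e-8 (the defaults)
    lr_scheduler_type="linear",
    max_steps=7500,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",            # assumed: evaluation every 500 steps, matching the table below
    eval_steps=500,
    predict_with_generate=True,       # assumed, needed to report WER during evaluation
)
```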

Training results

Training Loss    Step    Validation Loss    WER
0.3639           500     0.396971           24.3028
0.2625           1000    0.358340           22.5210
0.2212           1500    0.341232           21.0322
0.1455           2000    0.330033           20.2046
0.1406           2500    0.324484           20.0508
0.1244           3000    0.321562           19.5279
0.0848           3500    0.321506           19.5114
0.0844           4000    0.316492           19.1462
0.0731           4500    0.321992           19.0167
0.0515           5000    0.324720           19.1492
0.0532           5500    0.324773           19.0148
0.0426           6000    0.332404           19.0576
0.0328           6500    0.334900           18.8249
0.0327           7000    0.335876           19.0080
0.0252           7500    0.343928           19.0031
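
WER figures like those in the table are typically computed with the Hugging Face evaluate library. The snippet below is a generic sketch with placeholder strings, not the exact evaluation code used for this model.

```python
# Generic WER computation sketch; reference/prediction strings are placeholders.
import evaluate

wer_metric = evaluate.load("wer")

references = ["dit is een voorbeeldzin"]
predictions = ["dit is een voorbeeld zin"]

# evaluate returns a fraction; multiply by 100 to match the percentage-style
# numbers reported above (e.g. 19.0031).
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```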

Framework versions

  • Transformers 4.42.0.dev0
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
