---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-a-nomimo
  results: []
---
# whisper-a-nomimo
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0351
- Wer: 22.9167
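
A minimal inference sketch using the `transformers` ASR pipeline is shown below. The repository id and audio path are placeholders (the card does not document where the checkpoint is hosted), so adjust them to the actual location of this model.

```python
# Minimal inference sketch. The repo id and audio path are placeholders;
# replace them with the actual location of the fine-tuned checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-a-nomimo",  # hypothetical repo id
    chunk_length_s=30,  # chunk long audio to fit Whisper's 30 s input window
)

result = asr("sample.wav")  # path to a local audio file
print(result["text"])
```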
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 15
- mixed_precision_training: Native AMP
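
These settings map directly onto the standard `transformers` trainer configuration. Below is a sketch of the corresponding `Seq2SeqTrainingArguments`; the output directory is a placeholder and the actual training script is not part of this card.

```python
# Sketch: the hyperparameters above expressed as Seq2SeqTrainingArguments.
# output_dir is a placeholder; the real training script is not documented here.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-a-nomimo",      # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,      # effective train batch size 16
    warmup_steps=132,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                          # "Native AMP" mixed precision
)
```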
### Training results
| Training Loss | Epoch   | Step | Validation Loss | Wer      |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.9699        | 0.9662  | 100  | 0.1287          | 46.6821  |
| 0.1644        | 1.9275  | 200  | 0.1624          | 445.0617 |
| 0.236         | 2.8889  | 300  | 0.0895          | 23.1481  |
| 0.0518        | 3.8502  | 400  | 0.0479          | 18.4414  |
| 0.0321        | 4.8116  | 500  | 0.0426          | 14.8148  |
| 0.03          | 5.7729  | 600  | 0.0482          | 19.4444  |
| 0.0218        | 6.7343  | 700  | 0.0325          | 11.6512  |
| 0.0143        | 7.6957  | 800  | 0.0439          | 15.2778  |
| 0.0147        | 8.6570  | 900  | 0.0339          | 11.9599  |
| 0.0104        | 9.6184  | 1000 | 0.0391          | 14.5833  |
| 0.0079        | 10.5797 | 1100 | 0.0338          | 33.9506  |
| 0.0054        | 11.5411 | 1200 | 0.0293          | 20.4475  |
| 0.0032        | 12.5024 | 1300 | 0.0357          | 14.3519  |
| 0.002         | 13.4638 | 1400 | 0.0327          | 18.0556  |
| 0.0023        | 14.4251 | 1500 | 0.0351          | 22.9167  |
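
The Wer column is word error rate, reported as a percentage. A minimal sketch of computing it with the `evaluate` library is shown below; the strings are illustrative, not taken from the actual evaluation set.

```python
# Sketch: computing word error rate (WER) with the evaluate library.
# The strings below are illustrative examples, not the actual evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["the cat sat on the mat"],
    references=["the cat sat on a mat"],
)
print(f"WER: {100 * wer:.4f}%")  # scaled to a percentage, as in the table above
```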
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0