---
library_name: transformers
language:
  - uz
license: apache-2.0
base_model: jamshidahmadov/whisper-uz-v2
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper base uz V3 - Jamshid Ahmadov
    results: []
---

# Whisper base uz V3 - Jamshid Ahmadov

This model is a fine-tuned version of [jamshidahmadov/whisper-uz-v2](https://huggingface.co/jamshidahmadov/whisper-uz-v2) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1568
- Wer: 16.6718

## Model description

More information needed

## Intended uses & limitations

More information needed
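
Until the card is filled in, here is a minimal inference sketch using the `transformers` pipeline. The repo id `jamshidahmadov/whisper-uz` is inferred from this card's path, and `sample.wav` is a placeholder audio file, not something specified by the author:

```python
from transformers import pipeline

# Repo id inferred from this model card's location; adjust if the model lives elsewhere.
asr = pipeline(
    "automatic-speech-recognition",
    model="jamshidahmadov/whisper-uz",
)

# "sample.wav" is a placeholder path to a 16 kHz mono recording in Uzbek.
result = asr("sample.wav", generate_kwargs={"language": "uzbek", "task": "transcribe"})
print(result["text"])
```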

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto training arguments follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 2500
- mixed_precision_training: Native AMP
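
As a rough guide, the list above corresponds to `Seq2SeqTrainingArguments` along these lines. This is a sketch, not the author's exact training script, and `output_dir` is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-uz-v3",   # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=2500,
    fp16=True,                      # native AMP mixed precision
)
```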

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.2166        | 0.5333 | 500  | 0.2338          | 22.8910 |
| 0.1257        | 1.0661 | 1000 | 0.1892          | 18.7192 |
| 0.1153        | 1.5995 | 1500 | 0.1673          | 17.1644 |
| 0.0672        | 2.1323 | 2000 | 0.1603          | 17.1644 |
| 0.0663        | 2.6656 | 2500 | 0.1568          | 16.6718 |
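
The Wer column is word error rate, reported as a percentage. A minimal sketch of how such a score can be computed with the `evaluate` library (the example strings are hypothetical, and the card does not state which tool was actually used):

```python
import evaluate

# Word error rate = (substitutions + insertions + deletions) / reference words.
wer = evaluate.load("wer")

# Hypothetical transcripts, purely for illustration.
predictions = ["salom dunyo"]
references = ["salom dunyo do'stlar"]
print(100 * wer.compute(predictions=predictions, references=references))  # 33.33...
```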

### Framework versions

- Transformers 4.47.0
- PyTorch 2.4.0
- Datasets 3.2.0
- Tokenizers 0.21.0