---
language:
  - ko
license: apache-2.0
base_model: openai/whisper-base
tags:
  - hf-asr-leaderboard
  - generated_from_trainer
datasets:
  - Oyounghyun/whisper_study_data
model-index:
  - name: study0627
    results: []
---

# study0627

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the Oyounghyun/whisper_study_data dataset. It achieves the following results on the evaluation set:

- Loss: 0.0686
- Cer: 2.4264
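The Cer value is the character error rate in percent. As a rough illustration (not the exact metric script used during training), CER can be computed as the character-level Levenshtein distance between reference and hypothesis, divided by the reference length:

```python
def levenshtein(ref: str, hyp: str) -> int:
    """Character-level edit distance (insertions, deletions, substitutions)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]


def cer(ref: str, hyp: str) -> float:
    """Character error rate as a percentage of the reference length."""
    return 100.0 * levenshtein(ref, hyp) / len(ref)
```

For example, `cer("abcd", "abed")` is 25.0, since one of four reference characters is substituted.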

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- training_steps: 2000
- mixed_precision_training: Native AMP
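With `lr_scheduler_type: linear` and 10 warmup steps, the learning rate ramps linearly from 0 to 1e-05 over the first 10 steps and then decays linearly to 0 at step 2000. A minimal sketch of that schedule (mirroring the behavior of the Transformers linear scheduler, not its actual code):

```python
def linear_lr(step: int, base_lr: float = 1e-05,
              warmup_steps: int = 10, training_steps: int = 2000) -> float:
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        # Warmup phase: ramp from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Decay phase: shrink linearly to 0 at training_steps.
    return base_lr * max(0.0, (training_steps - step)
                         / max(1, training_steps - warmup_steps))
```

Halfway through the decay (step 1005), the rate is back down to 5e-06.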

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Cer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 0.0599        | 1.9608  | 100  | 0.1089          | 4.6347 |
| 0.01          | 3.9216  | 200  | 0.0720          | 2.4264 |
| 0.0033        | 5.8824  | 300  | 0.0682          | 3.0807 |
| 0.002         | 7.8431  | 400  | 0.0670          | 2.5627 |
| 0.0014        | 9.8039  | 500  | 0.0667          | 2.6718 |
| 0.0011        | 11.7647 | 600  | 0.0667          | 2.5082 |
| 0.0009        | 13.7255 | 700  | 0.0668          | 2.7263 |
| 0.0008        | 15.6863 | 800  | 0.0671          | 2.5082 |
| 0.0007        | 17.6471 | 900  | 0.0674          | 2.5082 |
| 0.0006        | 19.6078 | 1000 | 0.0675          | 2.6172 |
| 0.0005        | 21.5686 | 1100 | 0.0675          | 2.5082 |
| 0.0005        | 23.5294 | 1200 | 0.0676          | 2.3991 |
| 0.0005        | 25.4902 | 1300 | 0.0679          | 2.3991 |
| 0.0004        | 27.4510 | 1400 | 0.0682          | 2.4264 |
| 0.0004        | 29.4118 | 1500 | 0.0682          | 2.4264 |
| 0.0004        | 31.3725 | 1600 | 0.0684          | 2.4264 |
| 0.0003        | 33.3333 | 1700 | 0.0684          | 2.4264 |
| 0.0004        | 35.2941 | 1800 | 0.0685          | 2.4264 |
| 0.0004        | 37.2549 | 1900 | 0.0685          | 2.4264 |
| 0.0003        | 39.2157 | 2000 | 0.0686          | 2.4264 |
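The fractional epoch values fall out of the eval interval: evaluation ran every 100 steps, and the epoch column is step count divided by steps per epoch. From the first row, 100 steps ≈ 1.9608 epochs, which implies 51 optimizer steps per epoch (so roughly 51 × 16 = 816 training examples at batch size 16 — an inference from the table, not a stated figure):

```python
# Inferred from the table: 100 / 1.9608 ≈ 51 steps per epoch.
steps_per_epoch = 51

for step in (100, 2000):
    epoch = step / steps_per_epoch
    print(f"step {step}: epoch {epoch:.4f}")
# step 100 gives 1.9608 and step 2000 gives 39.2157, matching the table.
```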

### Framework versions

- Transformers 4.43.0.dev0
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1