---
library_name: transformers
license: mit
base_model: catsOfpeople/speecht5_finetuned_emirhan_soomea
tags:
  - generated_from_trainer
model-index:
  - name: speecht5_soome-V2
    results: []
---

# speecht5_soome-V2

This model is a fine-tuned version of [catsOfpeople/speecht5_finetuned_emirhan_soomea](https://huggingface.co/catsOfpeople/speecht5_finetuned_emirhan_soomea) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2695
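
For a quick start, below is a minimal text-to-speech sketch using the Transformers SpeechT5 API. The Hub id `catsOfpeople/speecht5_soome-V2`, the `microsoft/speecht5_hifigan` vocoder, and the placeholder zero speaker embedding are illustrative assumptions, not confirmed details of this model; substitute an x-vector embedding for your target speaker.

```python
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

# Hub repo id is assumed from the model name; adjust if the checkpoint lives elsewhere.
repo_id = "catsOfpeople/speecht5_soome-V2"
processor = SpeechT5Processor.from_pretrained(repo_id)
model = SpeechT5ForTextToSpeech.from_pretrained(repo_id)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello, world.", return_tensors="pt")

# SpeechT5 conditions generation on a 512-dim x-vector speaker embedding.
# A zero vector is only a placeholder; use an embedding from your target speaker.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```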

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 3500
- mixed_precision_training: Native AMP
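
For anyone reproducing the run, here is a sketch of how the values above map onto `Seq2SeqTrainingArguments`. The `output_dir` and the `fp16=True` reading of "Native AMP" are assumptions; the original training script is not published.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_soome-V2",   # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=8,    # effective train batch size: 16 * 8 = 128
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=3500,
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision (assumed)
)
```

No explicit optimizer argument is needed: the Trainer's default AdamW settings (betas=(0.9, 0.999), epsilon=1e-8) already match the reported Adam configuration.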

### Training results

| Training Loss | Epoch    | Step | Validation Loss |
|:-------------:|:--------:|:----:|:---------------:|
| 0.9648        | 4.5198   | 100  | 0.4308          |
| 0.4495        | 9.0395   | 200  | 0.3583          |
| 0.384         | 13.5593  | 300  | 0.3418          |
| 0.3637        | 18.0791  | 400  | 0.3177          |
| 0.3443        | 22.5989  | 500  | 0.3119          |
| 0.3366        | 27.1186  | 600  | 0.3099          |
| 0.3328        | 31.6384  | 700  | 0.3222          |
| 0.3238        | 36.1582  | 800  | 0.3091          |
| 0.3196        | 40.6780  | 900  | 0.2960          |
| 0.3156        | 45.1977  | 1000 | 0.2977          |
| 0.3123        | 49.7175  | 1100 | 0.2960          |
| 0.3107        | 54.2373  | 1200 | 0.2904          |
| 0.3029        | 58.7571  | 1300 | 0.2891          |
| 0.2978        | 63.2768  | 1400 | 0.2904          |
| 0.3012        | 67.7966  | 1500 | 0.2855          |
| 0.2977        | 72.3164  | 1600 | 0.2863          |
| 0.2915        | 76.8362  | 1700 | 0.2855          |
| 0.2935        | 81.3559  | 1800 | 0.2853          |
| 0.2877        | 85.8757  | 1900 | 0.2794          |
| 0.2839        | 90.3955  | 2000 | 0.2820          |
| 0.2847        | 94.9153  | 2100 | 0.2781          |
| 0.2831        | 99.4350  | 2200 | 0.2799          |
| 0.283         | 103.9548 | 2300 | 0.2811          |
| 0.2792        | 108.4746 | 2400 | 0.2774          |
| 0.2788        | 112.9944 | 2500 | 0.2813          |
| 0.2793        | 117.5141 | 2600 | 0.2755          |
| 0.2746        | 122.0339 | 2700 | 0.2769          |
| 0.2735        | 126.5537 | 2800 | 0.2729          |
| 0.2728        | 131.0734 | 2900 | 0.2764          |
| 0.2735        | 135.5932 | 3000 | 0.2751          |
| 0.2726        | 140.1130 | 3100 | 0.2754          |
| 0.2691        | 144.6328 | 3200 | 0.2707          |
| 0.2711        | 149.1525 | 3300 | 0.2717          |
| 0.2679        | 153.6723 | 3400 | 0.2724          |
| 0.2665        | 158.1921 | 3500 | 0.2695          |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1