
Evenki Wav2Vec2-large-960h

This model is a fine-tuned version of facebook/wav2vec2-base-960h on the Evenki dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9874
  • WER: 100.0%
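
A minimal inference sketch with transformers is shown below. The Hub repository id, the audio file name, and the use of torchaudio for loading are illustrative assumptions; substitute this model's actual Hub path.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id -- replace with this model's actual Hub path.
model_id = "your-username/evenki-wav2vec2-large-960h"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a mono waveform and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # hypothetical file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most probable token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```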

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 5
  • mixed_precision_training: Native AMP
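
As a hedged reconstruction, the list above maps onto transformers' TrainingArguments roughly as follows; output_dir and any setting not listed in the card are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-evenki",    # assumed; not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=5,
    fp16=True,                       # corresponds to "Native AMP"
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```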

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER (%) |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 15.7212       | 0.1595 | 50   | 15.5310         | 100.0   |
| 2.6269        | 0.3190 | 100  | 2.8415          | 99.7180 |
| 2.3072        | 0.4785 | 150  | 1.9787          | 99.4047 |
| 1.0452        | 0.6380 | 200  | 0.8638          | 100.0   |
| 1.1478        | 0.7974 | 250  | 2.7045          | 97.8850 |
| 1.3178        | 0.9569 | 300  | 0.8292          | 97.8850 |
| 1.3027        | 1.1164 | 350  | 6.3059          | 100.0   |
| 0.8389        | 1.2759 | 400  | 7.4609          | 100.0   |
| 2.0418        | 1.4354 | 450  | 6.3700          | 100.0   |
| 1.1457        | 1.5949 | 500  | 2.8782          | 100.0   |
| 0.787         | 1.7544 | 550  | 2.3695          | 100.0   |
| 0.6559        | 1.9139 | 600  | 3.4332          | 100.0   |
| 0.9913        | 2.0734 | 650  | 2.2318          | 100.0   |
| 0.7569        | 2.2329 | 700  | 2.1465          | 100.0   |
| 0.6979        | 2.3923 | 750  | 1.0902          | 100.0   |
| 1.2206        | 2.5518 | 800  | 2.2949          | 100.0   |
| 0.8054        | 2.7113 | 850  | 2.7187          | 100.0   |
| 0.686         | 2.8708 | 900  | 2.4085          | 100.0   |
| 0.8234        | 3.0303 | 950  | 0.9381          | 100.0   |
| 0.9062        | 3.1898 | 1000 | 1.0240          | 100.0   |
| 1.2424        | 3.3493 | 1050 | 1.1234          | 100.0   |
| 0.8654        | 3.5088 | 1100 | 4.7173          | 100.0   |
| 0.7106        | 3.6683 | 1150 | 0.9443          | 100.0   |
| 0.7094        | 3.8278 | 1200 | 0.9422          | 100.0   |
| 0.7053        | 3.9872 | 1250 | 1.1659          | 100.0   |
| 0.7629        | 4.1467 | 1300 | 1.1088          | 100.0   |
| 0.715         | 4.3062 | 1350 | 1.0223          | 100.0   |
| 1.2843        | 4.4657 | 1400 | 1.0170          | 100.0   |
| 0.6653        | 4.6252 | 1450 | 0.9866          | 100.0   |
| 0.668         | 4.7847 | 1500 | 0.9866          | 100.0   |
| 0.7227        | 4.9442 | 1550 | 0.9874          | 100.0   |
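
WER values in this card are percentages. A sketch of how the metric is typically computed with the evaluate library follows; the example strings are hypothetical.

```python
import evaluate

# evaluate's "wer" metric returns a fraction; the card reports percentages,
# hence the factor of 100.
wer_metric = evaluate.load("wer")

predictions = ["a hypothesized transcription"]  # hypothetical strings
references = ["the reference transcription"]
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.1f}%")
```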

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.19.1