
wav2vec2-large-xlsr-korean-demo-test

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 for Korean automatic speech recognition; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.9829
  • WER: 0.5580
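
The card provides no usage example, so here is a minimal inference sketch using the standard transformers CTC classes. The Hub repo ID, the audio file path, and the use of librosa for loading are assumptions for illustration, not part of the original card; XLSR checkpoints expect 16 kHz mono audio.

```python
import torch
import librosa  # assumed here for audio loading; any 16 kHz mono loader works
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo ID: prepend the owning namespace on the Hugging Face Hub.
MODEL_ID = "wav2vec2-large-xlsr-korean-demo-test"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load and resample to the 16 kHz mono input the model expects.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```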

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
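
For reference, the same configuration expressed as a transformers TrainingArguments sketch. The output_dir is a placeholder, and the Adam betas and epsilon spelled out below are the library defaults, matching the values listed above.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-korean-demo-test",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    adam_beta1=0.9,                  # library defaults, shown to match the list
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```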

Training results

Training Loss | Epoch | Step | Validation Loss | WER
:------------ | :---- | :--- | :-------------- | :-----
8.1603 | 0.4 | 400 | 5.0560 | 1.0
3.0513 | 0.79 | 800 | 2.1226 | 0.9984
1.7673 | 1.19 | 1200 | 1.2358 | 0.9273
1.4577 | 1.59 | 1600 | 1.0198 | 0.8512
1.3308 | 1.98 | 2000 | 0.9258 | 0.8325
1.1798 | 2.38 | 2400 | 0.8587 | 0.7933
1.1268 | 2.77 | 2800 | 0.8166 | 0.7677
1.0664 | 3.17 | 3200 | 0.7911 | 0.7428
0.9923 | 3.57 | 3600 | 0.7964 | 0.7481
1.0059 | 3.96 | 4000 | 0.7617 | 0.7163
0.9141 | 4.36 | 4400 | 0.7854 | 0.7280
0.8939 | 4.76 | 4800 | 0.7364 | 0.7160
0.8689 | 5.15 | 5200 | 0.7895 | 0.6996
0.8236 | 5.55 | 5600 | 0.7756 | 0.7100
0.8409 | 5.95 | 6000 | 0.7433 | 0.6915
0.7643 | 6.34 | 6400 | 0.7566 | 0.6993
0.7601 | 6.74 | 6800 | 0.7873 | 0.6836
0.7367 | 7.14 | 7200 | 0.7353 | 0.6640
0.7099 | 7.53 | 7600 | 0.7421 | 0.6766
0.7084 | 7.93 | 8000 | 0.7396 | 0.6740
0.6837 | 8.32 | 8400 | 0.7717 | 0.6647
0.6513 | 8.72 | 8800 | 0.7763 | 0.6798
0.6458 | 9.12 | 9200 | 0.7659 | 0.6494
0.6132 | 9.51 | 9600 | 0.7693 | 0.6511
0.6287 | 9.91 | 10000 | 0.7555 | 0.6469
0.6008 | 10.31 | 10400 | 0.7606 | 0.6408
0.5796 | 10.7 | 10800 | 0.7622 | 0.6397
0.5753 | 11.1 | 11200 | 0.7816 | 0.6510
0.5531 | 11.5 | 11600 | 0.8351 | 0.6658
0.5215 | 11.89 | 12000 | 0.7843 | 0.6416
0.5205 | 12.29 | 12400 | 0.7674 | 0.6256
0.5219 | 12.69 | 12800 | 0.7594 | 0.6287
0.5186 | 13.08 | 13200 | 0.7863 | 0.6243
0.473 | 13.48 | 13600 | 0.8209 | 0.6469
0.4938 | 13.87 | 14000 | 0.8002 | 0.6241
0.474 | 14.27 | 14400 | 0.8008 | 0.6122
0.442 | 14.67 | 14800 | 0.8047 | 0.6089
0.4521 | 15.06 | 15200 | 0.8341 | 0.6123
0.4289 | 15.46 | 15600 | 0.8217 | 0.6122
0.4278 | 15.86 | 16000 | 0.8400 | 0.6152
0.4051 | 16.25 | 16400 | 0.8634 | 0.6182
0.4063 | 16.65 | 16800 | 0.8486 | 0.6097
0.4101 | 17.05 | 17200 | 0.8825 | 0.6002
0.3896 | 17.44 | 17600 | 0.9575 | 0.6205
0.3833 | 17.84 | 18000 | 0.8946 | 0.6216
0.3678 | 18.24 | 18400 | 0.8905 | 0.5952
0.3715 | 18.63 | 18800 | 0.8918 | 0.5994
0.3748 | 19.03 | 19200 | 0.8856 | 0.5953
0.3485 | 19.42 | 19600 | 0.9326 | 0.5906
0.3522 | 19.82 | 20000 | 0.9237 | 0.5932
0.3551 | 20.22 | 20400 | 0.9274 | 0.5932
0.3339 | 20.61 | 20800 | 0.9075 | 0.5883
0.3354 | 21.01 | 21200 | 0.9306 | 0.5861
0.318 | 21.41 | 21600 | 0.8994 | 0.5854
0.3235 | 21.8 | 22000 | 0.9114 | 0.5831
0.3201 | 22.2 | 22400 | 0.9415 | 0.5867
0.308 | 22.6 | 22800 | 0.9695 | 0.5807
0.3049 | 22.99 | 23200 | 0.9166 | 0.5765
0.2858 | 23.39 | 23600 | 0.9643 | 0.5746
0.2938 | 23.79 | 24000 | 0.9461 | 0.5724
0.2856 | 24.18 | 24400 | 0.9658 | 0.5710
0.2827 | 24.58 | 24800 | 0.9534 | 0.5693
0.2745 | 24.97 | 25200 | 0.9436 | 0.5675
0.2705 | 25.37 | 25600 | 0.9849 | 0.5701
0.2656 | 25.77 | 26000 | 0.9854 | 0.5662
0.2645 | 26.16 | 26400 | 0.9795 | 0.5662
0.262 | 26.56 | 26800 | 0.9496 | 0.5626
0.2553 | 26.96 | 27200 | 0.9787 | 0.5659
0.2602 | 27.35 | 27600 | 0.9814 | 0.5640
0.2519 | 27.75 | 28000 | 0.9816 | 0.5631
0.2386 | 28.15 | 28400 | 1.0012 | 0.5580
0.2398 | 28.54 | 28800 | 0.9892 | 0.5567
0.2368 | 28.94 | 29200 | 0.9909 | 0.5590
0.2366 | 29.34 | 29600 | 0.9827 | 0.5567
0.2347 | 29.73 | 30000 | 0.9829 | 0.5580
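
Validation loss reaches its minimum (0.7353) around epoch 7 and drifts upward afterwards, while WER keeps improving through epoch 30. WER (word error rate) is the word-level edit distance divided by the number of reference words; below is a minimal sketch of the metric using the jiwer package (an assumption, since the original evaluation script is not shown, and the strings are illustrative):

```python
from jiwer import wer  # pip install jiwer

# WER = (substitutions + deletions + insertions) / number of reference words
reference = "안녕하세요 만나서 반갑습니다"   # illustrative, not from the eval set
hypothesis = "안녕하세요 만나서 반갑다"     # one substituted word out of three

print(wer(reference, hypothesis))  # 1/3 ≈ 0.3333
```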

Framework versions

  • Transformers 4.21.1
  • PyTorch 1.12.1+cu113
  • Datasets 2.4.0
  • Tokenizers 0.12.1