
wav2vec2-commonvoice-20subset-xls-r-300m-gpu1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6717
  • Wer: 0.9321
  • Cer: 0.2636
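For reference, Wer (word error rate) and Cer (character error rate) are both Levenshtein edit distance normalized by reference length, computed over words and characters respectively, so a WER of 0.9321 means roughly 93 word-level edits per 100 reference words. A minimal sketch with hypothetical example strings (not the trainer's own implementation):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # d[j] still holds the previous row; prev holds its upper-left neighbour
            prev, d[j] = d[j], min(d[j] + 1,          # deletion
                                   d[j - 1] + 1,      # insertion
                                   prev + (r != h))   # substitution (free on match)
    return d[-1]

def wer(ref, hyp):
    """Word error rate: edits over word sequences / reference word count."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: edits over characters / reference length."""
    return edit_distance(ref, hyp) / len(ref)

print(wer("the cat sat", "the bat sat"))  # one substituted word out of three
print(cer("the cat sat", "the bat sat"))  # one substituted character out of eleven
```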

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 13
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 26
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 300
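The linear scheduler with warmup ramps the learning rate from 0 to the peak of 5e-05 over the first 500 optimizer steps, then decays it linearly to 0 at the final step. A sketch of that shape; `total_steps = 63000` is an estimate inferred from the log below (which ends at step 62800 just before epoch 300), not a value stated in this card:

```python
def lr_at(step, peak=5e-05, warmup=500, total_steps=63000):
    """Linear warmup to `peak`, then linear decay to zero."""
    if step < warmup:
        return peak * step / warmup  # warmup ramp
    # linear decay from peak at `warmup` down to 0 at `total_steps`
    return peak * max(0.0, (total_steps - step) / (total_steps - warmup))

print(lr_at(250))    # halfway through warmup
print(lr_at(500))    # peak learning rate
print(lr_at(63000))  # fully decayed
```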

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
| No log | 1.9 | 400 | 44.1474 | 1.0 | 1.0 |
| 69.9134 | 3.81 | 800 | 6.6651 | 1.0 | 1.0 |
| 7.6107 | 5.71 | 1200 | 6.5059 | 1.0 | 1.0 |
| 6.4614 | 7.62 | 1600 | 6.4456 | 1.0 | 1.0 |
| 6.3268 | 9.52 | 2000 | 6.3535 | 1.0 | 1.0000 |
| 6.3268 | 11.43 | 2400 | 6.1350 | 1.0063 | 0.9689 |
| 6.0793 | 13.33 | 2800 | 5.1209 | 1.0325 | 0.8288 |
| 5.2214 | 15.24 | 3200 | 4.3109 | 1.0148 | 0.6910 |
| 4.1439 | 17.14 | 3600 | 3.8123 | 1.0280 | 0.6100 |
| 3.558 | 19.05 | 4000 | 3.4451 | 1.0091 | 0.5639 |
| 3.558 | 20.95 | 4400 | 3.0901 | 1.0091 | 0.5271 |
| 3.0826 | 22.86 | 4800 | 2.7863 | 1.0080 | 0.4950 |
| 2.6891 | 24.76 | 5200 | 2.5825 | 1.0 | 0.4749 |
| 2.3535 | 26.67 | 5600 | 2.3098 | 1.0029 | 0.4420 |
| 2.0582 | 28.57 | 6000 | 2.1619 | 0.9960 | 0.4246 |
| 2.0582 | 30.48 | 6400 | 2.0340 | 0.9960 | 0.4097 |
| 1.8555 | 32.38 | 6800 | 1.9168 | 0.9874 | 0.3966 |
| 1.6825 | 34.29 | 7200 | 1.8196 | 0.9806 | 0.3849 |
| 1.4992 | 36.19 | 7600 | 1.7298 | 0.9863 | 0.3751 |
| 1.3534 | 38.1 | 8000 | 1.6588 | 0.9897 | 0.3674 |
| 1.3534 | 40.0 | 8400 | 1.6227 | 0.9823 | 0.3629 |
| 1.2413 | 41.9 | 8800 | 1.5783 | 0.9914 | 0.3556 |
| 1.1372 | 43.81 | 9200 | 1.5328 | 0.9755 | 0.3453 |
| 1.029 | 45.71 | 9600 | 1.5342 | 0.9726 | 0.3448 |
| 0.9409 | 47.62 | 10000 | 1.4863 | 0.9760 | 0.3362 |
| 0.9409 | 49.52 | 10400 | 1.4664 | 0.9715 | 0.3313 |
| 0.8615 | 51.43 | 10800 | 1.4799 | 0.9669 | 0.3337 |
| 0.7901 | 53.33 | 11200 | 1.4427 | 0.9635 | 0.3200 |
| 0.7302 | 55.24 | 11600 | 1.4679 | 0.9561 | 0.3244 |
| 0.6738 | 57.14 | 12000 | 1.4386 | 0.9521 | 0.3163 |
| 0.6738 | 59.05 | 12400 | 1.4515 | 0.9492 | 0.3139 |
| 0.6116 | 60.95 | 12800 | 1.4543 | 0.9526 | 0.3132 |
| 0.5558 | 62.86 | 13200 | 1.4303 | 0.9635 | 0.3079 |
| 0.504 | 64.76 | 13600 | 1.4425 | 0.9503 | 0.3045 |
| 0.4573 | 66.67 | 14000 | 1.4330 | 0.9515 | 0.3039 |
| 0.4573 | 68.57 | 14400 | 1.4553 | 0.9521 | 0.3017 |
| 0.429 | 70.48 | 14800 | 1.4727 | 0.9492 | 0.3031 |
| 0.3906 | 72.38 | 15200 | 1.4562 | 0.9458 | 0.3001 |
| 0.3664 | 74.29 | 15600 | 1.4699 | 0.9469 | 0.3007 |
| 0.3331 | 76.19 | 16000 | 1.4874 | 0.9515 | 0.3019 |
| 0.3331 | 78.1 | 16400 | 1.4866 | 0.9395 | 0.2999 |
| 0.3189 | 80.0 | 16800 | 1.4830 | 0.9412 | 0.2990 |
| 0.2963 | 81.9 | 17200 | 1.5135 | 0.9486 | 0.2984 |
| 0.2839 | 83.81 | 17600 | 1.5121 | 0.9475 | 0.2953 |
| 0.2602 | 85.71 | 18000 | 1.5313 | 0.9401 | 0.2990 |
| 0.2602 | 87.62 | 18400 | 1.5082 | 0.9418 | 0.2902 |
| 0.2515 | 89.52 | 18800 | 1.5320 | 0.9458 | 0.2969 |
| 0.2371 | 91.43 | 19200 | 1.5632 | 0.9446 | 0.2974 |
| 0.2327 | 93.33 | 19600 | 1.5268 | 0.9441 | 0.2942 |
| 0.2097 | 95.24 | 20000 | 1.5531 | 0.9406 | 0.2986 |
| 0.2097 | 97.14 | 20400 | 1.5443 | 0.9412 | 0.2923 |
| 0.1975 | 99.05 | 20800 | 1.5483 | 0.9418 | 0.2918 |
| 0.1886 | 100.95 | 21200 | 1.5669 | 0.9401 | 0.2909 |
| 0.1831 | 102.86 | 21600 | 1.5583 | 0.9389 | 0.2897 |
| 0.1762 | 104.76 | 22000 | 1.5557 | 0.9441 | 0.2904 |
| 0.1762 | 106.67 | 22400 | 1.5734 | 0.9366 | 0.2877 |
| 0.1681 | 108.57 | 22800 | 1.5873 | 0.9418 | 0.2917 |
| 0.1681 | 110.48 | 23200 | 1.5834 | 0.9395 | 0.2898 |
| 0.1542 | 112.38 | 23600 | 1.5941 | 0.9395 | 0.2861 |
| 0.1532 | 114.29 | 24000 | 1.5816 | 0.9424 | 0.2862 |
| 0.1532 | 116.19 | 24400 | 1.5806 | 0.9384 | 0.2850 |
| 0.1471 | 118.1 | 24800 | 1.5898 | 0.9418 | 0.2859 |
| 0.138 | 120.0 | 25200 | 1.6176 | 0.9412 | 0.2895 |
| 0.1361 | 121.9 | 25600 | 1.5888 | 0.9395 | 0.2865 |
| 0.1328 | 123.81 | 26000 | 1.6248 | 0.9418 | 0.2855 |
| 0.1328 | 125.71 | 26400 | 1.5954 | 0.9424 | 0.2864 |
| 0.123 | 127.62 | 26800 | 1.6179 | 0.9389 | 0.2844 |
| 0.1213 | 129.52 | 27200 | 1.6266 | 0.9418 | 0.2847 |
| 0.115 | 131.43 | 27600 | 1.6193 | 0.9406 | 0.2815 |
| 0.1137 | 133.33 | 28000 | 1.6255 | 0.9481 | 0.2868 |
| 0.1137 | 135.24 | 28400 | 1.6178 | 0.9378 | 0.2818 |
| 0.1089 | 137.14 | 28800 | 1.6339 | 0.9401 | 0.2827 |
| 0.1107 | 139.05 | 29200 | 1.6422 | 0.9378 | 0.2837 |
| 0.101 | 140.95 | 29600 | 1.6294 | 0.9418 | 0.2815 |
| 0.1006 | 142.86 | 30000 | 1.6290 | 0.9395 | 0.2843 |
| 0.1006 | 144.76 | 30400 | 1.6260 | 0.9384 | 0.2816 |
| 0.0991 | 146.67 | 30800 | 1.6283 | 0.9395 | 0.2801 |
| 0.0988 | 148.57 | 31200 | 1.6348 | 0.9435 | 0.2818 |
| 0.1 | 150.48 | 31600 | 1.6505 | 0.9435 | 0.2784 |
| 0.0927 | 152.38 | 32000 | 1.6468 | 0.9441 | 0.2798 |
| 0.0927 | 154.29 | 32400 | 1.6486 | 0.9424 | 0.2769 |
| 0.0851 | 156.19 | 32800 | 1.6455 | 0.9452 | 0.2811 |
| 0.089 | 158.1 | 33200 | 1.6307 | 0.9378 | 0.2776 |
| 0.0863 | 160.0 | 33600 | 1.6386 | 0.9355 | 0.2792 |
| 0.0852 | 161.9 | 34000 | 1.6244 | 0.9361 | 0.2757 |
| 0.0852 | 163.81 | 34400 | 1.6420 | 0.9389 | 0.2762 |
| 0.0835 | 165.71 | 34800 | 1.6474 | 0.9395 | 0.2749 |
| 0.0802 | 167.62 | 35200 | 1.6536 | 0.9406 | 0.2782 |
| 0.0787 | 169.52 | 35600 | 1.6594 | 0.9452 | 0.2796 |
| 0.0798 | 171.43 | 36000 | 1.6528 | 0.9366 | 0.2757 |
| 0.0798 | 173.33 | 36400 | 1.6518 | 0.9389 | 0.2747 |
| 0.0731 | 175.24 | 36800 | 1.6534 | 0.9406 | 0.2764 |
| 0.0745 | 177.14 | 37200 | 1.6638 | 0.9429 | 0.2770 |
| 0.0714 | 179.05 | 37600 | 1.6393 | 0.9406 | 0.2754 |
| 0.0694 | 180.95 | 38000 | 1.6421 | 0.9406 | 0.2746 |
| 0.0694 | 182.86 | 38400 | 1.6625 | 0.9378 | 0.2755 |
| 0.0727 | 184.76 | 38800 | 1.6549 | 0.9389 | 0.2753 |
| 0.0673 | 186.67 | 39200 | 1.6575 | 0.9406 | 0.2756 |
| 0.0695 | 188.57 | 39600 | 1.6684 | 0.9389 | 0.2737 |
| 0.0664 | 190.48 | 40000 | 1.6717 | 0.9429 | 0.2757 |
| 0.0664 | 192.38 | 40400 | 1.6636 | 0.9406 | 0.2751 |
| 0.0643 | 194.29 | 40800 | 1.6702 | 0.9441 | 0.2732 |
| 0.0627 | 196.19 | 41200 | 1.6517 | 0.9338 | 0.2735 |
| 0.0617 | 198.1 | 41600 | 1.6519 | 0.9378 | 0.2711 |
| 0.059 | 200.0 | 42000 | 1.6579 | 0.9378 | 0.2732 |
| 0.059 | 201.9 | 42400 | 1.6542 | 0.9315 | 0.2701 |
| 0.0595 | 203.81 | 42800 | 1.6607 | 0.9361 | 0.2713 |
| 0.0602 | 205.71 | 43200 | 1.6488 | 0.9378 | 0.2717 |
| 0.0581 | 207.62 | 43600 | 1.6651 | 0.9384 | 0.2693 |
| 0.0573 | 209.52 | 44000 | 1.6697 | 0.9384 | 0.2715 |
| 0.0573 | 211.43 | 44400 | 1.6585 | 0.9384 | 0.2708 |
| 0.0523 | 213.33 | 44800 | 1.6599 | 0.9372 | 0.2723 |
| 0.0541 | 215.24 | 45200 | 1.6683 | 0.9384 | 0.2732 |
| 0.0607 | 217.14 | 45600 | 1.6696 | 0.9355 | 0.2718 |
| 0.0512 | 219.05 | 46000 | 1.6733 | 0.9384 | 0.2740 |
| 0.0512 | 220.95 | 46400 | 1.6803 | 0.9372 | 0.2709 |
| 0.0515 | 222.86 | 46800 | 1.6684 | 0.9366 | 0.2692 |
| 0.0507 | 224.76 | 47200 | 1.6695 | 0.9315 | 0.2710 |
| 0.0502 | 226.67 | 47600 | 1.6789 | 0.9372 | 0.2692 |
| 0.0496 | 228.57 | 48000 | 1.6619 | 0.9326 | 0.2705 |
| 0.0496 | 230.48 | 48400 | 1.6707 | 0.9355 | 0.2705 |
| 0.049 | 232.38 | 48800 | 1.6655 | 0.9361 | 0.2688 |
| 0.0482 | 234.29 | 49200 | 1.6643 | 0.9389 | 0.2706 |
| 0.047 | 236.19 | 49600 | 1.6750 | 0.9355 | 0.2706 |
| 0.0474 | 238.1 | 50000 | 1.6913 | 0.9395 | 0.2719 |
| 0.0474 | 240.0 | 50400 | 1.6857 | 0.9384 | 0.2702 |
| 0.0458 | 241.9 | 50800 | 1.6815 | 0.9349 | 0.2664 |
| 0.0456 | 243.81 | 51200 | 1.6596 | 0.9378 | 0.2672 |
| 0.0423 | 245.71 | 51600 | 1.6623 | 0.9315 | 0.2664 |
| 0.0436 | 247.62 | 52000 | 1.6647 | 0.9344 | 0.2680 |
| 0.0436 | 249.52 | 52400 | 1.6706 | 0.9326 | 0.2671 |
| 0.043 | 251.43 | 52800 | 1.6751 | 0.9321 | 0.2687 |
| 0.043 | 253.33 | 53200 | 1.6706 | 0.9292 | 0.2688 |
| 0.0424 | 255.24 | 53600 | 1.6663 | 0.9304 | 0.2686 |
| 0.0431 | 257.14 | 54000 | 1.6684 | 0.9321 | 0.2674 |
| 0.0431 | 259.05 | 54400 | 1.6758 | 0.9309 | 0.2679 |
| 0.0405 | 260.95 | 54800 | 1.6666 | 0.9298 | 0.2667 |
| 0.0413 | 262.86 | 55200 | 1.6773 | 0.9315 | 0.2661 |
| 0.0393 | 264.76 | 55600 | 1.6702 | 0.9287 | 0.2650 |
| 0.0408 | 266.67 | 56000 | 1.6664 | 0.9332 | 0.2658 |
| 0.0408 | 268.57 | 56400 | 1.6738 | 0.9326 | 0.2660 |
| 0.0381 | 270.48 | 56800 | 1.6668 | 0.9332 | 0.2651 |
| 0.0383 | 272.38 | 57200 | 1.6747 | 0.9338 | 0.2651 |
| 0.0367 | 274.29 | 57600 | 1.6667 | 0.9304 | 0.2639 |
| 0.0393 | 276.19 | 58000 | 1.6743 | 0.9321 | 0.2643 |
| 0.0393 | 278.1 | 58400 | 1.6712 | 0.9326 | 0.2644 |
| 0.0377 | 280.0 | 58800 | 1.6759 | 0.9321 | 0.2637 |
| 0.0379 | 281.9 | 59200 | 1.6714 | 0.9315 | 0.2635 |
| 0.0376 | 283.81 | 59600 | 1.6798 | 0.9326 | 0.2643 |
| 0.0374 | 285.71 | 60000 | 1.6803 | 0.9338 | 0.2638 |
| 0.0374 | 287.62 | 60400 | 1.6764 | 0.9332 | 0.2645 |
| 0.0372 | 289.52 | 60800 | 1.6796 | 0.9326 | 0.2634 |
| 0.0354 | 291.43 | 61200 | 1.6762 | 0.9309 | 0.2628 |
| 0.0365 | 293.33 | 61600 | 1.6761 | 0.9315 | 0.2627 |
| 0.0349 | 295.24 | 62000 | 1.6746 | 0.9321 | 0.2631 |
| 0.0349 | 297.14 | 62400 | 1.6716 | 0.9321 | 0.2636 |
| 0.0345 | 299.05 | 62800 | 1.6717 | 0.9321 | 0.2636 |
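The headline metrics come from the final checkpoint (epoch 299.05), but the log's lowest WER, 0.9287, was reached back at epoch 264.76 with a slightly better validation loss as well. A small sketch of scanning rows for the best checkpoint, using only four (epoch, Wer) pairs copied from the table for illustration:

```python
# (epoch, wer) pairs copied from four rows of the training log above
rows = [(253.33, 0.9292), (264.76, 0.9287), (280.0, 0.9321), (299.05, 0.9321)]

# pick the checkpoint with the lowest word error rate
best_epoch, best_wer = min(rows, key=lambda r: r[1])
print(best_epoch, best_wer)  # the epoch-264.76 checkpoint has the lowest WER
```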

Framework versions

  • Transformers 4.31.0
  • Pytorch 1.13.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3