wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-1hr-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The dataset field of this auto-generated card was left unset; going by the model name, training used a 1-hour Swahili (sw) set drawn from Common Voice, FLEURS, AMMI, and ALFFA. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 1.8810
  • Wer: 0.4988
  • Cer: 0.1672
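
To transcribe audio with this checkpoint, a minimal inference sketch is shown below. The audio path "sample.wav" is a placeholder, and the input is assumed to be 16 kHz mono, matching the base model's expected sampling rate.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-1hr-v1"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# "sample.wav" is a placeholder path; resample to 16 kHz as the model expects.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the argmax token per frame, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```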

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
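
The sketch below expresses these settings as a transformers TrainingArguments object; it is illustrative only (output_dir is hypothetical, and the model/dataset wiring is omitted).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw-1hr",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",             # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,                       # native AMP mixed-precision training
)
```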

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 17.0          | 1.0   | 36   | 13.0736         | 1.0    | 1.0    |
| 9.9036        | 2.0   | 72   | 4.8880          | 1.0    | 1.0    |
| 4.684         | 3.0   | 108  | 3.5599          | 1.0    | 1.0    |
| 3.5015        | 4.0   | 144  | 3.1648          | 1.0    | 1.0    |
| 3.201         | 5.0   | 180  | 3.0654          | 1.0    | 1.0    |
| 3.1147        | 6.0   | 216  | 3.1915          | 1.0    | 1.0    |
| 3.0914        | 7.0   | 252  | 2.9619          | 1.0    | 1.0    |
| 3.01          | 8.0   | 288  | 3.0046          | 1.0    | 1.0    |
| 2.9785        | 9.0   | 324  | 2.9234          | 1.0    | 1.0    |
| 2.932         | 10.0  | 360  | 2.9227          | 1.0    | 1.0    |
| 2.8853        | 11.0  | 396  | 2.8842          | 1.0    | 1.0    |
| 2.7422        | 12.0  | 432  | 2.4736          | 0.9999 | 0.9446 |
| 2.0966        | 13.0  | 468  | 1.5906          | 0.9995 | 0.4546 |
| 1.449         | 14.0  | 504  | 1.3529          | 0.8594 | 0.3155 |
| 1.2739        | 15.0  | 540  | 1.2643          | 0.7826 | 0.2549 |
| 0.968         | 16.0  | 576  | 1.1934          | 0.7199 | 0.2297 |
| 0.8544        | 17.0  | 612  | 1.1714          | 0.6661 | 0.2161 |
| 0.7248        | 18.0  | 648  | 1.1922          | 0.6587 | 0.2126 |
| 0.6452        | 19.0  | 684  | 1.3711          | 0.6823 | 0.2196 |
| 0.6399        | 20.0  | 720  | 1.2777          | 0.6351 | 0.2120 |
| 0.5218        | 21.0  | 756  | 1.3353          | 0.6113 | 0.2011 |
| 0.5141        | 22.0  | 792  | 1.3149          | 0.6116 | 0.1995 |
| 0.4709        | 23.0  | 828  | 1.2793          | 0.6262 | 0.2050 |
| 0.4386        | 24.0  | 864  | 1.3153          | 0.6057 | 0.1971 |
| 0.3992        | 25.0  | 900  | 1.3247          | 0.6032 | 0.1970 |
| 0.3569        | 26.0  | 936  | 1.4275          | 0.5992 | 0.1980 |
| 0.3628        | 27.0  | 972  | 1.3171          | 0.5915 | 0.1924 |
| 0.3241        | 28.0  | 1008 | 1.3894          | 0.5791 | 0.1904 |
| 0.3993        | 29.0  | 1044 | 1.4247          | 0.5856 | 0.1942 |
| 0.2921        | 30.0  | 1080 | 1.4364          | 0.5721 | 0.1889 |
| 0.2929        | 31.0  | 1116 | 1.4470          | 0.5646 | 0.1875 |
| 0.2705        | 32.0  | 1152 | 1.3813          | 0.5596 | 0.1865 |
| 0.2675        | 33.0  | 1188 | 1.5556          | 0.5587 | 0.1857 |
| 0.2917        | 34.0  | 1224 | 1.4195          | 0.5680 | 0.1886 |
| 0.2571        | 35.0  | 1260 | 1.5744          | 0.5683 | 0.1871 |
| 0.2378        | 36.0  | 1296 | 1.5611          | 0.5588 | 0.1850 |
| 0.2181        | 37.0  | 1332 | 1.6092          | 0.5618 | 0.1869 |
| 0.2197        | 38.0  | 1368 | 1.5259          | 0.5727 | 0.1890 |
| 0.2022        | 39.0  | 1404 | 1.5426          | 0.5594 | 0.1862 |
| 0.1899        | 40.0  | 1440 | 1.5704          | 0.5645 | 0.1841 |
| 0.1995        | 41.0  | 1476 | 1.5666          | 0.5660 | 0.1834 |
| 0.1972        | 42.0  | 1512 | 1.6442          | 0.5521 | 0.1843 |
| 0.1749        | 43.0  | 1548 | 1.6143          | 0.5566 | 0.1836 |
| 0.1569        | 44.0  | 1584 | 1.6420          | 0.5598 | 0.1844 |
| 0.1659        | 45.0  | 1620 | 1.7003          | 0.5542 | 0.1845 |
| 0.1969        | 46.0  | 1656 | 1.4453          | 0.5482 | 0.1813 |
| 0.1609        | 47.0  | 1692 | 1.6009          | 0.5539 | 0.1838 |
| 0.1613        | 48.0  | 1728 | 1.6792          | 0.5512 | 0.1843 |
| 0.1498        | 49.0  | 1764 | 1.5508          | 0.5443 | 0.1827 |
| 0.1437        | 50.0  | 1800 | 1.7122          | 0.5340 | 0.1794 |
| 0.1674        | 51.0  | 1836 | 1.6303          | 0.5330 | 0.1787 |
| 0.1368        | 52.0  | 1872 | 1.7204          | 0.5476 | 0.1819 |
| 0.1247        | 53.0  | 1908 | 1.7727          | 0.5435 | 0.1825 |
| 0.1321        | 54.0  | 1944 | 1.7033          | 0.5361 | 0.1788 |
| 0.116         | 55.0  | 1980 | 1.6836          | 0.5356 | 0.1789 |
| 0.1095        | 56.0  | 2016 | 1.7173          | 0.5367 | 0.1784 |
| 0.1236        | 57.0  | 2052 | 1.8125          | 0.5406 | 0.1791 |
| 0.1123        | 58.0  | 2088 | 1.7084          | 0.5340 | 0.1783 |
| 0.1103        | 59.0  | 2124 | 1.6993          | 0.5348 | 0.1786 |
| 0.105         | 60.0  | 2160 | 1.7396          | 0.5214 | 0.1743 |
| 0.105         | 61.0  | 2196 | 1.7277          | 0.5288 | 0.1762 |
| 0.1045        | 62.0  | 2232 | 1.7564          | 0.5295 | 0.1772 |
| 0.099         | 63.0  | 2268 | 1.7446          | 0.5183 | 0.1731 |
| 0.091         | 64.0  | 2304 | 1.8399          | 0.5235 | 0.1763 |
| 0.1165        | 65.0  | 2340 | 1.7453          | 0.5284 | 0.1770 |
| 0.0933        | 66.0  | 2376 | 1.7183          | 0.5201 | 0.1730 |
| 0.0945        | 67.0  | 2412 | 1.7575          | 0.5244 | 0.1751 |
| 0.0943        | 68.0  | 2448 | 1.8292          | 0.5179 | 0.1731 |
| 0.0804        | 69.0  | 2484 | 1.7515          | 0.5130 | 0.1715 |
| 0.0936        | 70.0  | 2520 | 1.7478          | 0.5197 | 0.1736 |
| 0.0847        | 71.0  | 2556 | 1.7778          | 0.5212 | 0.1750 |
| 0.0758        | 72.0  | 2592 | 1.8291          | 0.5167 | 0.1728 |
| 0.0787        | 73.0  | 2628 | 1.8027          | 0.5117 | 0.1712 |
| 0.0839        | 74.0  | 2664 | 1.7828          | 0.5160 | 0.1726 |
| 0.0691        | 75.0  | 2700 | 1.7989          | 0.5102 | 0.1714 |
| 0.0752        | 76.0  | 2736 | 1.8084          | 0.5112 | 0.1708 |
| 0.0706        | 77.0  | 2772 | 1.8100          | 0.5121 | 0.1709 |
| 0.0778        | 78.0  | 2808 | 1.7763          | 0.5085 | 0.1700 |
| 0.0631        | 79.0  | 2844 | 1.8313          | 0.5091 | 0.1696 |
| 0.0729        | 80.0  | 2880 | 1.8528          | 0.5055 | 0.1699 |
| 0.0656        | 81.0  | 2916 | 1.8918          | 0.5105 | 0.1711 |
| 0.078         | 82.0  | 2952 | 1.8473          | 0.5076 | 0.1718 |
| 0.0792        | 83.0  | 2988 | 1.7290          | 0.5054 | 0.1693 |
| 0.0649        | 84.0  | 3024 | 1.8294          | 0.5093 | 0.1695 |
| 0.0647        | 85.0  | 3060 | 1.8810          | 0.5023 | 0.1685 |
| 0.0656        | 86.0  | 3096 | 1.7913          | 0.5043 | 0.1683 |
| 0.0566        | 87.0  | 3132 | 1.8506          | 0.5049 | 0.1684 |
| 0.0619        | 88.0  | 3168 | 1.8519          | 0.5043 | 0.1677 |
| 0.0718        | 89.0  | 3204 | 1.8385          | 0.4996 | 0.1667 |
| 0.0562        | 90.0  | 3240 | 1.8502          | 0.5030 | 0.1675 |
| 0.0593        | 91.0  | 3276 | 1.8384          | 0.5038 | 0.1675 |
| 0.0632        | 92.0  | 3312 | 1.8463          | 0.5026 | 0.1679 |
| 0.0545        | 93.0  | 3348 | 1.8528          | 0.5009 | 0.1680 |
| 0.0566        | 94.0  | 3384 | 1.8471          | 0.4995 | 0.1675 |
| 0.0593        | 95.0  | 3420 | 1.8420          | 0.4989 | 0.1673 |
| 0.0578        | 96.0  | 3456 | 1.8687          | 0.4982 | 0.1670 |
| 0.0542        | 97.0  | 3492 | 1.8701          | 0.4988 | 0.1672 |
| 0.0602        | 98.0  | 3528 | 1.8767          | 0.4991 | 0.1672 |
| 0.0561        | 99.0  | 3564 | 1.8789          | 0.4982 | 0.1670 |
| 0.06          | 100.0 | 3600 | 1.8810          | 0.4988 | 0.1672 |
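
The Wer and Cer columns are standard word and character error rates. A minimal sketch of recomputing them with the evaluate library is shown below; the reference and hypothesis strings are illustrative, not drawn from the evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["habari ya asubuhi"]   # illustrative ground-truth transcript
predictions = ["habari za asubuhi"]  # illustrative model output

print(wer_metric.compute(references=references, predictions=predictions))
print(cer_metric.compute(references=references, predictions=predictions))
```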

Framework versions

  • Transformers 4.48.0
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0