# wav2vec2-large-xlsr-coraa-exp-6
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5587
- Wer: 0.3549
- Cer: 0.1821
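
Wer (word error rate) and Cer (character error rate) are edit-distance-based error rates over words and characters, respectively. The card does not say which tool produced these figures; a minimal sketch using the common `jiwer` package (the example strings are illustrative only):

```python
# Hypothetical illustration of how WER/CER are typically computed for ASR
# evaluation; the actual evaluation tooling for this model is not documented.
import jiwer

reference = "o gato subiu no telhado"
hypothesis = "o gato subiu no telhada"

wer = jiwer.wer(reference, hypothesis)  # word-level error rate
cer = jiwer.cer(reference, hypothesis)  # character-level error rate
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```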
## Model description
More information needed
## Intended uses & limitations
More information needed
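
No usage documentation is provided. As a starting point, here is a minimal inference sketch, assuming this checkpoint keeps the standard Wav2Vec2 CTC interface of its base model; the model id and audio path below are placeholders, and input audio is assumed to be mono 16 kHz:

```python
# Minimal inference sketch (assumptions noted above); not an official example.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-coraa-exp-6"  # placeholder: use the full hub id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)      # greedy CTC decoding
print(processor.batch_decode(pred_ids))
```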
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
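
The training script itself is not included in the card. As a rough guide, the hyperparameters listed above would map onto `transformers.TrainingArguments` roughly as follows; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments:

```python
# Sketch of the listed hyperparameters as TrainingArguments; assumptions
# (output_dir, any unlisted settings) are marked in comments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-coraa-exp-6",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                      # native AMP mixed-precision training
)
```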
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
38.8729 | 1.0 | 14 | 29.8196 | 1.0 | 0.9615 |
38.8729 | 2.0 | 28 | 9.8857 | 1.0 | 0.9619 |
38.8729 | 3.0 | 42 | 5.0224 | 1.0 | 0.9619 |
38.8729 | 4.0 | 56 | 4.1451 | 1.0 | 0.9619 |
38.8729 | 5.0 | 70 | 3.8275 | 1.0 | 0.9619 |
38.8729 | 6.0 | 84 | 3.6444 | 1.0 | 0.9619 |
38.8729 | 7.0 | 98 | 3.5199 | 1.0 | 0.9619 |
10.2228 | 8.0 | 112 | 3.4075 | 1.0 | 0.9619 |
10.2228 | 9.0 | 126 | 3.2890 | 1.0 | 0.9619 |
10.2228 | 10.0 | 140 | 3.2052 | 1.0 | 0.9619 |
10.2228 | 11.0 | 154 | 3.1952 | 1.0 | 0.9619 |
10.2228 | 12.0 | 168 | 3.1434 | 1.0 | 0.9619 |
10.2228 | 13.0 | 182 | 3.1563 | 1.0 | 0.9619 |
10.2228 | 14.0 | 196 | 3.1021 | 1.0 | 0.9619 |
3.1118 | 15.0 | 210 | 3.0893 | 1.0 | 0.9619 |
3.1118 | 16.0 | 224 | 3.0649 | 1.0 | 0.9619 |
3.1118 | 17.0 | 238 | 3.0773 | 1.0 | 0.9619 |
3.1118 | 18.0 | 252 | 3.0653 | 1.0 | 0.9619 |
3.1118 | 19.0 | 266 | 3.0556 | 1.0 | 0.9619 |
3.1118 | 20.0 | 280 | 3.0357 | 1.0 | 0.9619 |
3.1118 | 21.0 | 294 | 3.0212 | 1.0 | 0.9619 |
2.9669 | 22.0 | 308 | 3.0193 | 1.0 | 0.9619 |
2.9669 | 23.0 | 322 | 3.0091 | 1.0 | 0.9619 |
2.9669 | 24.0 | 336 | 3.0074 | 1.0 | 0.9619 |
2.9669 | 25.0 | 350 | 3.0022 | 1.0 | 0.9619 |
2.9669 | 26.0 | 364 | 3.0021 | 1.0 | 0.9619 |
2.9669 | 27.0 | 378 | 3.0017 | 1.0 | 0.9619 |
2.9669 | 28.0 | 392 | 2.9958 | 1.0 | 0.9619 |
2.9277 | 29.0 | 406 | 3.0000 | 1.0 | 0.9619 |
2.9277 | 30.0 | 420 | 2.9948 | 1.0 | 0.9619 |
2.9277 | 31.0 | 434 | 2.9936 | 1.0 | 0.9619 |
2.9277 | 32.0 | 448 | 2.9882 | 1.0 | 0.9619 |
2.9277 | 33.0 | 462 | 2.9866 | 1.0 | 0.9619 |
2.9277 | 34.0 | 476 | 2.9898 | 1.0 | 0.9619 |
2.9277 | 35.0 | 490 | 2.9887 | 1.0 | 0.9619 |
2.9186 | 36.0 | 504 | 2.9847 | 1.0 | 0.9619 |
2.9186 | 37.0 | 518 | 2.9852 | 1.0 | 0.9619 |
2.9186 | 38.0 | 532 | 2.9846 | 1.0 | 0.9619 |
2.9186 | 39.0 | 546 | 2.9766 | 1.0 | 0.9619 |
2.9186 | 40.0 | 560 | 2.9774 | 1.0 | 0.9619 |
2.9186 | 41.0 | 574 | 2.9617 | 1.0 | 0.9619 |
2.9186 | 42.0 | 588 | 2.9321 | 1.0 | 0.9619 |
2.8987 | 43.0 | 602 | 2.8981 | 1.0 | 0.9619 |
2.8987 | 44.0 | 616 | 2.8334 | 1.0 | 0.9619 |
2.8987 | 45.0 | 630 | 2.7951 | 1.0 | 0.9619 |
2.8987 | 46.0 | 644 | 2.7475 | 1.0 | 0.9619 |
2.8987 | 47.0 | 658 | 2.6645 | 1.0 | 0.9527 |
2.8987 | 48.0 | 672 | 2.6282 | 1.0 | 0.9541 |
2.8987 | 49.0 | 686 | 2.4763 | 1.0 | 0.8417 |
2.6692 | 50.0 | 700 | 2.2491 | 1.0 | 0.6703 |
2.6692 | 51.0 | 714 | 2.0035 | 1.0 | 0.5842 |
2.6692 | 52.0 | 728 | 1.7479 | 1.0 | 0.4960 |
2.6692 | 53.0 | 742 | 1.5238 | 1.0 | 0.4234 |
2.6692 | 54.0 | 756 | 1.3688 | 1.0 | 0.4067 |
2.6692 | 55.0 | 770 | 1.2722 | 0.9994 | 0.3967 |
2.6692 | 56.0 | 784 | 1.1672 | 0.9967 | 0.3818 |
2.6692 | 57.0 | 798 | 1.0985 | 0.8992 | 0.3311 |
1.6908 | 58.0 | 812 | 1.0185 | 0.8799 | 0.3196 |
1.6908 | 59.0 | 826 | 0.9546 | 0.8033 | 0.2906 |
1.6908 | 60.0 | 840 | 0.8950 | 0.6337 | 0.2470 |
1.6908 | 61.0 | 854 | 0.8433 | 0.5293 | 0.2234 |
1.6908 | 62.0 | 868 | 0.8039 | 0.4925 | 0.2156 |
1.6908 | 63.0 | 882 | 0.7747 | 0.4754 | 0.2117 |
1.6908 | 64.0 | 896 | 0.7777 | 0.4569 | 0.2091 |
0.9571 | 65.0 | 910 | 0.7666 | 0.4516 | 0.2074 |
0.9571 | 66.0 | 924 | 0.7772 | 0.4429 | 0.2072 |
0.9571 | 67.0 | 938 | 0.7258 | 0.4315 | 0.2026 |
0.9571 | 68.0 | 952 | 0.7159 | 0.4236 | 0.2013 |
0.9571 | 69.0 | 966 | 0.6914 | 0.4256 | 0.2006 |
0.9571 | 70.0 | 980 | 0.6768 | 0.4163 | 0.1992 |
0.9571 | 71.0 | 994 | 0.6966 | 0.4094 | 0.1981 |
0.6701 | 72.0 | 1008 | 0.6756 | 0.4108 | 0.1974 |
0.6701 | 73.0 | 1022 | 0.6746 | 0.4051 | 0.1964 |
0.6701 | 74.0 | 1036 | 0.6620 | 0.4007 | 0.1956 |
0.6701 | 75.0 | 1050 | 0.6627 | 0.4031 | 0.1957 |
0.6701 | 76.0 | 1064 | 0.6529 | 0.4007 | 0.1963 |
0.6701 | 77.0 | 1078 | 0.6478 | 0.3974 | 0.1947 |
0.6701 | 78.0 | 1092 | 0.6381 | 0.4017 | 0.1947 |
0.5683 | 79.0 | 1106 | 0.6425 | 0.3944 | 0.1935 |
0.5683 | 80.0 | 1120 | 0.6374 | 0.3917 | 0.1931 |
0.5683 | 81.0 | 1134 | 0.6219 | 0.3862 | 0.1911 |
0.5683 | 82.0 | 1148 | 0.6318 | 0.3854 | 0.1914 |
0.5683 | 83.0 | 1162 | 0.6325 | 0.3895 | 0.1933 |
0.5683 | 84.0 | 1176 | 0.6222 | 0.3852 | 0.1913 |
0.5683 | 85.0 | 1190 | 0.6149 | 0.3818 | 0.1897 |
0.4891 | 86.0 | 1204 | 0.6181 | 0.3805 | 0.1899 |
0.4891 | 87.0 | 1218 | 0.6089 | 0.3769 | 0.1889 |
0.4891 | 88.0 | 1232 | 0.6029 | 0.3748 | 0.1885 |
0.4891 | 89.0 | 1246 | 0.5954 | 0.3751 | 0.1872 |
0.4891 | 90.0 | 1260 | 0.5977 | 0.3755 | 0.1864 |
0.4891 | 91.0 | 1274 | 0.6000 | 0.3722 | 0.1873 |
0.4891 | 92.0 | 1288 | 0.5896 | 0.3740 | 0.1876 |
0.44 | 93.0 | 1302 | 0.5874 | 0.3781 | 0.1884 |
0.44 | 94.0 | 1316 | 0.5871 | 0.3716 | 0.1870 |
0.44 | 95.0 | 1330 | 0.5927 | 0.3740 | 0.1872 |
0.44 | 96.0 | 1344 | 0.6053 | 0.3755 | 0.1884 |
0.44 | 97.0 | 1358 | 0.5858 | 0.3718 | 0.1863 |
0.44 | 98.0 | 1372 | 0.5933 | 0.3736 | 0.1869 |
0.44 | 99.0 | 1386 | 0.5861 | 0.3722 | 0.1859 |
0.3835 | 100.0 | 1400 | 0.5969 | 0.3742 | 0.1872 |
0.3835 | 101.0 | 1414 | 0.5779 | 0.3681 | 0.1856 |
0.3835 | 102.0 | 1428 | 0.5938 | 0.3732 | 0.1872 |
0.3835 | 103.0 | 1442 | 0.5759 | 0.3663 | 0.1850 |
0.3835 | 104.0 | 1456 | 0.5893 | 0.3714 | 0.1876 |
0.3835 | 105.0 | 1470 | 0.5816 | 0.3665 | 0.1857 |
0.3835 | 106.0 | 1484 | 0.5775 | 0.3659 | 0.1860 |
0.3835 | 107.0 | 1498 | 0.5809 | 0.3704 | 0.1868 |
0.3613 | 108.0 | 1512 | 0.5722 | 0.3635 | 0.1854 |
0.3613 | 109.0 | 1526 | 0.5721 | 0.3623 | 0.1847 |
0.3613 | 110.0 | 1540 | 0.5774 | 0.3610 | 0.1847 |
0.3613 | 111.0 | 1554 | 0.5723 | 0.3631 | 0.1842 |
0.3613 | 112.0 | 1568 | 0.5748 | 0.3588 | 0.1837 |
0.3613 | 113.0 | 1582 | 0.5801 | 0.3623 | 0.1841 |
0.3613 | 114.0 | 1596 | 0.5773 | 0.3614 | 0.1838 |
0.3396 | 115.0 | 1610 | 0.5742 | 0.3623 | 0.1845 |
0.3396 | 116.0 | 1624 | 0.5832 | 0.3604 | 0.1848 |
0.3396 | 117.0 | 1638 | 0.5818 | 0.3592 | 0.1852 |
0.3396 | 118.0 | 1652 | 0.5700 | 0.3560 | 0.1836 |
0.3396 | 119.0 | 1666 | 0.5796 | 0.3608 | 0.1846 |
0.3396 | 120.0 | 1680 | 0.5706 | 0.3578 | 0.1837 |
0.3396 | 121.0 | 1694 | 0.5750 | 0.3584 | 0.1842 |
0.3327 | 122.0 | 1708 | 0.5764 | 0.3580 | 0.1842 |
0.3327 | 123.0 | 1722 | 0.5690 | 0.3551 | 0.1834 |
0.3327 | 124.0 | 1736 | 0.5587 | 0.3549 | 0.1821 |
0.3327 | 125.0 | 1750 | 0.5637 | 0.3543 | 0.1827 |
0.3327 | 126.0 | 1764 | 0.5634 | 0.3543 | 0.1823 |
0.3327 | 127.0 | 1778 | 0.5625 | 0.3531 | 0.1817 |
0.3327 | 128.0 | 1792 | 0.5737 | 0.3545 | 0.1826 |
0.3237 | 129.0 | 1806 | 0.5653 | 0.3539 | 0.1817 |
0.3237 | 130.0 | 1820 | 0.5671 | 0.3545 | 0.1824 |
0.3237 | 131.0 | 1834 | 0.5711 | 0.3549 | 0.1823 |
0.3237 | 132.0 | 1848 | 0.5682 | 0.3533 | 0.1819 |
0.3237 | 133.0 | 1862 | 0.5685 | 0.3545 | 0.1829 |
0.3237 | 134.0 | 1876 | 0.5662 | 0.3539 | 0.1826 |
0.3237 | 135.0 | 1890 | 0.5706 | 0.3535 | 0.1827 |
0.3038 | 136.0 | 1904 | 0.5678 | 0.3539 | 0.1821 |
0.3038 | 137.0 | 1918 | 0.5648 | 0.3543 | 0.1823 |
0.3038 | 138.0 | 1932 | 0.5638 | 0.3539 | 0.1819 |
0.3038 | 139.0 | 1946 | 0.5689 | 0.3541 | 0.1823 |
0.3038 | 140.0 | 1960 | 0.5710 | 0.3541 | 0.1825 |
0.3038 | 141.0 | 1974 | 0.5648 | 0.3533 | 0.1821 |
0.3038 | 142.0 | 1988 | 0.5656 | 0.3533 | 0.1820 |
0.2933 | 143.0 | 2002 | 0.5654 | 0.3539 | 0.1825 |
0.2933 | 144.0 | 2016 | 0.5667 | 0.3529 | 0.1828 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3