# wav2vec2-large-xlsr-coraa-exp-1

This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5564
- Wer: 0.3555
- Cer: 0.1821
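Both metrics are edit-distance ratios: WER counts word-level edits against the number of reference words, CER counts character-level edits against the number of reference characters. A minimal sketch of how they can be computed (not the exact evaluation code used for this run):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance via a single-row dynamic program."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(d[j] + 1,          # deletion
                                   d[j - 1] + 1,      # insertion
                                   prev + (r != h))   # substitution (free on match)
    return d[-1]

def wer(ref, hyp):
    # Word error rate: edits over word tokens.
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    # Character error rate: edits over characters.
    return edit_distance(ref, hyp) / len(ref)
```

In practice, libraries such as `jiwer` or the `evaluate` package are typically used for these metrics instead of hand-rolled code.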
## Model description
More information needed
## Intended uses & limitations
More information needed
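The model is a CTC speech recognizer, so transcription follows the standard `Wav2Vec2ForCTC` pattern. A hedged sketch, assuming the checkpoint id below and a 16 kHz mono input file named `audio.wav` (both are placeholders, not confirmed by this card):

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id; substitute the actual path of this checkpoint.
model_id = "wav2vec2-large-xlsr-coraa-exp-1"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio resampled to the 16 kHz rate wav2vec2 expects.
speech, _ = librosa.load("audio.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse with the tokenizer.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```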
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
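The hyperparameters above map directly onto `transformers.TrainingArguments`. A hypothetical reconstruction (the actual training script is not included in this card, and the `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-1",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",      # Adam(betas=(0.9, 0.999), eps=1e-8) is the default optimizer
    num_train_epochs=150,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```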
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---|:---|:---|:---|:---|:---|
39.0927 | 1.0 | 14 | 30.0708 | 1.0 | 0.9613 |
39.0927 | 2.0 | 28 | 9.0585 | 1.0 | 0.9619 |
39.0927 | 3.0 | 42 | 4.9408 | 1.0 | 0.9619 |
39.0927 | 4.0 | 56 | 4.1247 | 1.0 | 0.9619 |
39.0927 | 5.0 | 70 | 3.8053 | 1.0 | 0.9619 |
39.0927 | 6.0 | 84 | 3.6257 | 1.0 | 0.9619 |
39.0927 | 7.0 | 98 | 3.4882 | 1.0 | 0.9619 |
10.1374 | 8.0 | 112 | 3.3658 | 1.0 | 0.9619 |
10.1374 | 9.0 | 126 | 3.2659 | 1.0 | 0.9619 |
10.1374 | 10.0 | 140 | 3.1858 | 1.0 | 0.9619 |
10.1374 | 11.0 | 154 | 3.1509 | 1.0 | 0.9619 |
10.1374 | 12.0 | 168 | 3.1087 | 1.0 | 0.9619 |
10.1374 | 13.0 | 182 | 3.0901 | 1.0 | 0.9619 |
10.1374 | 14.0 | 196 | 3.0563 | 1.0 | 0.9619 |
3.0936 | 15.0 | 210 | 3.0725 | 1.0 | 0.9619 |
3.0936 | 16.0 | 224 | 3.0693 | 1.0 | 0.9619 |
3.0936 | 17.0 | 238 | 3.0401 | 1.0 | 0.9619 |
3.0936 | 18.0 | 252 | 3.0363 | 1.0 | 0.9619 |
3.0936 | 19.0 | 266 | 3.0399 | 1.0 | 0.9619 |
3.0936 | 20.0 | 280 | 3.0229 | 1.0 | 0.9619 |
3.0936 | 21.0 | 294 | 3.0135 | 1.0 | 0.9619 |
2.9599 | 22.0 | 308 | 3.0165 | 1.0 | 0.9619 |
2.9599 | 23.0 | 322 | 3.0048 | 1.0 | 0.9619 |
2.9599 | 24.0 | 336 | 3.0040 | 1.0 | 0.9619 |
2.9599 | 25.0 | 350 | 2.9981 | 1.0 | 0.9619 |
2.9599 | 26.0 | 364 | 3.0019 | 1.0 | 0.9619 |
2.9599 | 27.0 | 378 | 2.9937 | 1.0 | 0.9619 |
2.9599 | 28.0 | 392 | 2.9919 | 1.0 | 0.9619 |
2.9236 | 29.0 | 406 | 2.9904 | 1.0 | 0.9619 |
2.9236 | 30.0 | 420 | 2.9829 | 1.0 | 0.9619 |
2.9236 | 31.0 | 434 | 2.9772 | 1.0 | 0.9619 |
2.9236 | 32.0 | 448 | 2.9296 | 1.0 | 0.9619 |
2.9236 | 33.0 | 462 | 2.9011 | 1.0 | 0.9619 |
2.9236 | 34.0 | 476 | 2.8410 | 1.0 | 0.9619 |
2.9236 | 35.0 | 490 | 2.7730 | 1.0 | 0.9619 |
2.8623 | 36.0 | 504 | 2.7076 | 1.0 | 0.9618 |
2.8623 | 37.0 | 518 | 2.6268 | 1.0 | 0.9590 |
2.8623 | 38.0 | 532 | 2.4435 | 1.0 | 0.8387 |
2.8623 | 39.0 | 546 | 2.1666 | 1.0 | 0.6899 |
2.8623 | 40.0 | 560 | 1.8675 | 1.0 | 0.5183 |
2.8623 | 41.0 | 574 | 1.5864 | 1.0 | 0.4389 |
2.8623 | 42.0 | 588 | 1.4232 | 0.9996 | 0.3989 |
2.1779 | 43.0 | 602 | 1.2338 | 0.9976 | 0.3828 |
2.1779 | 44.0 | 616 | 1.1252 | 0.9431 | 0.3392 |
2.1779 | 45.0 | 630 | 1.0542 | 0.7505 | 0.2762 |
2.1779 | 46.0 | 644 | 0.9471 | 0.5510 | 0.2287 |
2.1779 | 47.0 | 658 | 0.8948 | 0.5154 | 0.2230 |
2.1779 | 48.0 | 672 | 0.8252 | 0.5055 | 0.2195 |
2.1779 | 49.0 | 686 | 0.7892 | 0.4691 | 0.2086 |
1.0861 | 50.0 | 700 | 0.7734 | 0.4464 | 0.2053 |
1.0861 | 51.0 | 714 | 0.7450 | 0.4466 | 0.2057 |
1.0861 | 52.0 | 728 | 0.7445 | 0.4421 | 0.2054 |
1.0861 | 53.0 | 742 | 0.7073 | 0.4291 | 0.2007 |
1.0861 | 54.0 | 756 | 0.7187 | 0.4279 | 0.2016 |
1.0861 | 55.0 | 770 | 0.7030 | 0.4185 | 0.1996 |
1.0861 | 56.0 | 784 | 0.6911 | 0.4130 | 0.1973 |
1.0861 | 57.0 | 798 | 0.6678 | 0.4055 | 0.1953 |
0.715 | 58.0 | 812 | 0.6554 | 0.4072 | 0.1947 |
0.715 | 59.0 | 826 | 0.6637 | 0.4110 | 0.1960 |
0.715 | 60.0 | 840 | 0.6606 | 0.4037 | 0.1962 |
0.715 | 61.0 | 854 | 0.6598 | 0.4069 | 0.1969 |
0.715 | 62.0 | 868 | 0.6365 | 0.4023 | 0.1946 |
0.715 | 63.0 | 882 | 0.6275 | 0.3937 | 0.1928 |
0.715 | 64.0 | 896 | 0.6460 | 0.3925 | 0.1941 |
0.5672 | 65.0 | 910 | 0.6349 | 0.3939 | 0.1945 |
0.5672 | 66.0 | 924 | 0.6282 | 0.3933 | 0.1938 |
0.5672 | 67.0 | 938 | 0.6014 | 0.3872 | 0.1901 |
0.5672 | 68.0 | 952 | 0.6073 | 0.3854 | 0.1899 |
0.5672 | 69.0 | 966 | 0.6144 | 0.3862 | 0.1914 |
0.5672 | 70.0 | 980 | 0.6038 | 0.3860 | 0.1912 |
0.5672 | 71.0 | 994 | 0.6110 | 0.3836 | 0.1916 |
0.4622 | 72.0 | 1008 | 0.6022 | 0.3781 | 0.1891 |
0.4622 | 73.0 | 1022 | 0.5961 | 0.3775 | 0.1890 |
0.4622 | 74.0 | 1036 | 0.5991 | 0.3753 | 0.1885 |
0.4622 | 75.0 | 1050 | 0.5966 | 0.3732 | 0.1887 |
0.4622 | 76.0 | 1064 | 0.5963 | 0.3785 | 0.1897 |
0.4622 | 77.0 | 1078 | 0.5902 | 0.3816 | 0.1896 |
0.4622 | 78.0 | 1092 | 0.5695 | 0.3738 | 0.1864 |
0.4311 | 79.0 | 1106 | 0.5828 | 0.3765 | 0.1869 |
0.4311 | 80.0 | 1120 | 0.5799 | 0.3748 | 0.1871 |
0.4311 | 81.0 | 1134 | 0.5753 | 0.3746 | 0.1874 |
0.4311 | 82.0 | 1148 | 0.5795 | 0.3738 | 0.1876 |
0.4311 | 83.0 | 1162 | 0.5899 | 0.3726 | 0.1884 |
0.4311 | 84.0 | 1176 | 0.5791 | 0.3671 | 0.1864 |
0.4311 | 85.0 | 1190 | 0.5711 | 0.3649 | 0.1850 |
0.3905 | 86.0 | 1204 | 0.5771 | 0.3692 | 0.1857 |
0.3905 | 87.0 | 1218 | 0.5769 | 0.3657 | 0.1850 |
0.3905 | 88.0 | 1232 | 0.5681 | 0.3663 | 0.1846 |
0.3905 | 89.0 | 1246 | 0.5772 | 0.3653 | 0.1846 |
0.3905 | 90.0 | 1260 | 0.5658 | 0.3623 | 0.1835 |
0.3905 | 91.0 | 1274 | 0.5706 | 0.3653 | 0.1853 |
0.3905 | 92.0 | 1288 | 0.5735 | 0.3600 | 0.1838 |
0.3626 | 93.0 | 1302 | 0.5607 | 0.3598 | 0.1833 |
0.3626 | 94.0 | 1316 | 0.5736 | 0.3610 | 0.1839 |
0.3626 | 95.0 | 1330 | 0.5701 | 0.3604 | 0.1847 |
0.3626 | 96.0 | 1344 | 0.5775 | 0.3637 | 0.1856 |
0.3626 | 97.0 | 1358 | 0.5564 | 0.3555 | 0.1821 |
0.3626 | 98.0 | 1372 | 0.5770 | 0.3580 | 0.1839 |
0.3626 | 99.0 | 1386 | 0.5692 | 0.3584 | 0.1831 |
0.3218 | 100.0 | 1400 | 0.5748 | 0.3582 | 0.1831 |
0.3218 | 101.0 | 1414 | 0.5647 | 0.3553 | 0.1822 |
0.3218 | 102.0 | 1428 | 0.5756 | 0.3584 | 0.1831 |
0.3218 | 103.0 | 1442 | 0.5739 | 0.3590 | 0.1833 |
0.3218 | 104.0 | 1456 | 0.5663 | 0.3586 | 0.1828 |
0.3218 | 105.0 | 1470 | 0.5631 | 0.3602 | 0.1829 |
0.3218 | 106.0 | 1484 | 0.5747 | 0.3616 | 0.1838 |
0.3218 | 107.0 | 1498 | 0.5691 | 0.3590 | 0.1838 |
0.3032 | 108.0 | 1512 | 0.5573 | 0.3582 | 0.1829 |
0.3032 | 109.0 | 1526 | 0.5605 | 0.3570 | 0.1834 |
0.3032 | 110.0 | 1540 | 0.5719 | 0.3568 | 0.1838 |
0.3032 | 111.0 | 1554 | 0.5595 | 0.3568 | 0.1826 |
0.3032 | 112.0 | 1568 | 0.5614 | 0.3570 | 0.1825 |
0.3032 | 113.0 | 1582 | 0.5676 | 0.3566 | 0.1832 |
0.3032 | 114.0 | 1596 | 0.5715 | 0.3572 | 0.1834 |
0.2957 | 115.0 | 1610 | 0.5735 | 0.3584 | 0.1831 |
0.2957 | 116.0 | 1624 | 0.5706 | 0.3588 | 0.1833 |
0.2957 | 117.0 | 1638 | 0.5708 | 0.3551 | 0.1828 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3