# wav2vec2-large-xlsr-coraa-exp-2
This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7598
- Wer: 0.4437
- Cer: 0.2059
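The WER and CER above are edit-distance-based metrics (substitutions, insertions, and deletions over words or characters, divided by the reference length). A minimal pure-Python sketch of how they are computed; real evaluations typically use a library such as `jiwer` or `evaluate`, and the function names below are our own:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (words or characters)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution
        prev = cur
    return prev[n]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate, ignoring spaces."""
    ref_chars = reference.replace(" ", "")
    return edit_distance(ref_chars, hypothesis.replace(" ", "")) / len(ref_chars)
```

For example, `wer("o gato preto", "o gato prto")` is 1/3 (one of three words wrong), while the CER for the same pair is 0.1 (one of ten characters).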
## Model description
More information needed
## Intended uses & limitations
More information needed
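The card does not document usage, but a checkpoint of this type (a wav2vec2 CTC model) is typically loaded with 🤗 Transformers along the following lines. This is a hypothetical sketch, not the authors' script: the repo id below is the card title and may need its full namespace, and `audio.wav` is a placeholder for a 16 kHz mono recording.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder id; replace with the full "<namespace>/..." repo id on the Hub.
model_id = "wav2vec2-large-xlsr-coraa-exp-2"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz expected by XLSR models.
waveform, sr = torchaudio.load("audio.wav")  # placeholder path
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(),
                   sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse via the tokenizer.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```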
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
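The list above corresponds roughly to the following 🤗 Transformers `TrainingArguments`. This is a reconstruction for illustration, not the original training script; the output directory is a placeholder:

```python
from transformers import TrainingArguments

# Sketch assembled from the hyperparameter list above; not the authors' code.
training_args = TrainingArguments(
    output_dir="./output",             # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,     # effective train batch: 16 * 2 = 32
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                         # "Native AMP" mixed precision
)
```

Note that the reported `total_train_batch_size` of 32 is not set directly: it is the product of `per_device_train_batch_size` (16) and `gradient_accumulation_steps` (2).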
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
37.4939 | 1.0 | 14 | 27.8472 | 1.0 | 0.9612 |
37.4939 | 2.0 | 28 | 7.8236 | 1.0 | 0.9619 |
37.4939 | 3.0 | 42 | 4.5483 | 1.0 | 0.9619 |
37.4939 | 4.0 | 56 | 3.9455 | 1.0 | 0.9619 |
37.4939 | 5.0 | 70 | 3.6981 | 1.0 | 0.9619 |
37.4939 | 6.0 | 84 | 3.5551 | 1.0 | 0.9619 |
37.4939 | 7.0 | 98 | 3.4345 | 1.0 | 0.9619 |
9.4755 | 8.0 | 112 | 3.3129 | 1.0 | 0.9619 |
9.4755 | 9.0 | 126 | 3.2352 | 1.0 | 0.9619 |
9.4755 | 10.0 | 140 | 3.1795 | 1.0 | 0.9619 |
9.4755 | 11.0 | 154 | 3.1330 | 1.0 | 0.9619 |
9.4755 | 12.0 | 168 | 3.1329 | 1.0 | 0.9619 |
9.4755 | 13.0 | 182 | 3.0781 | 1.0 | 0.9619 |
9.4755 | 14.0 | 196 | 3.0682 | 1.0 | 0.9619 |
3.0804 | 15.0 | 210 | 3.0734 | 1.0 | 0.9619 |
3.0804 | 16.0 | 224 | 3.0565 | 1.0 | 0.9619 |
3.0804 | 17.0 | 238 | 3.0534 | 1.0 | 0.9619 |
3.0804 | 18.0 | 252 | 3.0327 | 1.0 | 0.9619 |
3.0804 | 19.0 | 266 | 3.0382 | 1.0 | 0.9619 |
3.0804 | 20.0 | 280 | 3.0320 | 1.0 | 0.9619 |
3.0804 | 21.0 | 294 | 3.0175 | 1.0 | 0.9619 |
2.9645 | 22.0 | 308 | 3.0184 | 1.0 | 0.9619 |
2.9645 | 23.0 | 322 | 3.0053 | 1.0 | 0.9619 |
2.9645 | 24.0 | 336 | 3.0082 | 1.0 | 0.9619 |
2.9645 | 25.0 | 350 | 3.0016 | 1.0 | 0.9619 |
2.9645 | 26.0 | 364 | 3.0093 | 1.0 | 0.9619 |
2.9645 | 27.0 | 378 | 2.9993 | 1.0 | 0.9619 |
2.9645 | 28.0 | 392 | 2.9998 | 1.0 | 0.9619 |
2.9326 | 29.0 | 406 | 3.0039 | 1.0 | 0.9619 |
2.9326 | 30.0 | 420 | 3.0013 | 1.0 | 0.9619 |
2.9326 | 31.0 | 434 | 2.9991 | 1.0 | 0.9619 |
2.9326 | 32.0 | 448 | 2.9949 | 1.0 | 0.9619 |
2.9326 | 33.0 | 462 | 2.9920 | 1.0 | 0.9619 |
2.9326 | 34.0 | 476 | 2.9980 | 1.0 | 0.9619 |
2.9326 | 35.0 | 490 | 2.9905 | 1.0 | 0.9619 |
2.9228 | 36.0 | 504 | 2.9935 | 1.0 | 0.9619 |
2.9228 | 37.0 | 518 | 2.9859 | 1.0 | 0.9619 |
2.9228 | 38.0 | 532 | 2.9879 | 1.0 | 0.9619 |
2.9228 | 39.0 | 546 | 2.9838 | 1.0 | 0.9619 |
2.9228 | 40.0 | 560 | 2.9819 | 1.0 | 0.9619 |
2.9228 | 41.0 | 574 | 2.9832 | 1.0 | 0.9619 |
2.9228 | 42.0 | 588 | 2.9748 | 1.0 | 0.9619 |
2.9107 | 43.0 | 602 | 2.9705 | 1.0 | 0.9616 |
2.9107 | 44.0 | 616 | 2.9658 | 1.0 | 0.9591 |
2.9107 | 45.0 | 630 | 2.9676 | 1.0 | 0.9580 |
2.9107 | 46.0 | 644 | 2.9602 | 1.0 | 0.9617 |
2.9107 | 47.0 | 658 | 2.9288 | 1.0 | 0.9581 |
2.9107 | 48.0 | 672 | 2.9023 | 1.0 | 0.9564 |
2.9107 | 49.0 | 686 | 2.8598 | 1.0 | 0.9601 |
2.8675 | 50.0 | 700 | 2.8022 | 1.0 | 0.9617 |
2.8675 | 51.0 | 714 | 2.7749 | 1.0 | 0.9559 |
2.8675 | 52.0 | 728 | 2.7368 | 1.0 | 0.9614 |
2.8675 | 53.0 | 742 | 2.6779 | 1.0 | 0.9597 |
2.8675 | 54.0 | 756 | 2.6466 | 1.0 | 0.9513 |
2.8675 | 55.0 | 770 | 2.6083 | 1.0 | 0.9381 |
2.8675 | 56.0 | 784 | 2.5400 | 1.0 | 0.9046 |
2.8675 | 57.0 | 798 | 2.4022 | 1.0 | 0.8121 |
2.6433 | 58.0 | 812 | 2.2463 | 1.0 | 0.7267 |
2.6433 | 59.0 | 826 | 2.0689 | 1.0 | 0.6162 |
2.6433 | 60.0 | 840 | 1.9145 | 1.0 | 0.5704 |
2.6433 | 61.0 | 854 | 1.7756 | 1.0 | 0.5095 |
2.6433 | 62.0 | 868 | 1.6238 | 1.0 | 0.4700 |
2.6433 | 63.0 | 882 | 1.4970 | 1.0 | 0.4454 |
2.6433 | 64.0 | 896 | 1.4010 | 1.0 | 0.4264 |
2.0023 | 65.0 | 910 | 1.3292 | 1.0 | 0.4142 |
2.0023 | 66.0 | 924 | 1.2790 | 0.9996 | 0.4043 |
2.0023 | 67.0 | 938 | 1.2129 | 0.9972 | 0.3898 |
2.0023 | 68.0 | 952 | 1.1590 | 0.9937 | 0.3795 |
2.0023 | 69.0 | 966 | 1.1193 | 0.9793 | 0.3618 |
2.0023 | 70.0 | 980 | 1.0872 | 0.9567 | 0.3482 |
2.0023 | 71.0 | 994 | 1.0603 | 0.9025 | 0.3278 |
1.3694 | 72.0 | 1008 | 1.0181 | 0.8694 | 0.3145 |
1.3694 | 73.0 | 1022 | 0.9941 | 0.8249 | 0.3000 |
1.3694 | 74.0 | 1036 | 0.9689 | 0.7082 | 0.2688 |
1.3694 | 75.0 | 1050 | 0.9346 | 0.6274 | 0.2466 |
1.3694 | 76.0 | 1064 | 0.9144 | 0.5603 | 0.2331 |
1.3694 | 77.0 | 1078 | 0.8997 | 0.5238 | 0.2253 |
1.3694 | 78.0 | 1092 | 0.8630 | 0.5154 | 0.2227 |
1.0996 | 79.0 | 1106 | 0.8602 | 0.4974 | 0.2171 |
1.0996 | 80.0 | 1120 | 0.8447 | 0.4911 | 0.2166 |
1.0996 | 81.0 | 1134 | 0.8308 | 0.4890 | 0.2164 |
1.0996 | 82.0 | 1148 | 0.8408 | 0.4781 | 0.2146 |
1.0996 | 83.0 | 1162 | 0.8201 | 0.4732 | 0.2128 |
1.0996 | 84.0 | 1176 | 0.8140 | 0.4711 | 0.2120 |
1.0996 | 85.0 | 1190 | 0.8041 | 0.4655 | 0.2105 |
0.9419 | 86.0 | 1204 | 0.7987 | 0.4616 | 0.2104 |
0.9419 | 87.0 | 1218 | 0.7883 | 0.4588 | 0.2092 |
0.9419 | 88.0 | 1232 | 0.7889 | 0.4581 | 0.2093 |
0.9419 | 89.0 | 1246 | 0.7859 | 0.4553 | 0.2088 |
0.9419 | 90.0 | 1260 | 0.7796 | 0.4508 | 0.2079 |
0.9419 | 91.0 | 1274 | 0.7814 | 0.4496 | 0.2081 |
0.9419 | 92.0 | 1288 | 0.7753 | 0.4482 | 0.2079 |
0.8707 | 93.0 | 1302 | 0.7674 | 0.4480 | 0.2068 |
0.8707 | 94.0 | 1316 | 0.7664 | 0.4474 | 0.2065 |
0.8707 | 95.0 | 1330 | 0.7612 | 0.4478 | 0.2063 |
0.8707 | 96.0 | 1344 | 0.7640 | 0.4441 | 0.2062 |
0.8707 | 97.0 | 1358 | 0.7598 | 0.4437 | 0.2059 |
0.8707 | 98.0 | 1372 | 0.7621 | 0.4427 | 0.2058 |
0.8707 | 99.0 | 1386 | 0.7634 | 0.4427 | 0.2059 |
0.8144 | 100.0 | 1400 | 0.7631 | 0.4425 | 0.2059 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3