# wav2vec2-large-xlsr-mecita-coraa-portuguese-random-all-02
This model is a fine-tuned version of [jonatasgrosman/wav2vec2-xls-r-1b-portuguese](https://huggingface.co/jonatasgrosman/wav2vec2-xls-r-1b-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2941
- WER: 0.1192
- CER: 0.0385
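As a usage illustration, here is a minimal inference sketch with `transformers` and `torchaudio`, assuming the standard Wav2Vec2 CTC pipeline. The model id is taken from this card's title (adjust it to the actual Hub path), and `audio.wav` is a hypothetical 16 kHz mono Portuguese recording:

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumed model id, from this card's title; replace with the real Hub path.
model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-random-all-02"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# "audio.wav" is a hypothetical input file; XLS-R models expect 16 kHz audio.
waveform, sample_rate = torchaudio.load("audio.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: best token per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```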
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
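For orientation, the list above maps roughly onto `transformers.TrainingArguments` as in the sketch below. The `output_dir` and the evaluation/save strategies are assumptions, not recorded on this card; the Adam settings shown are the library defaults and match the values listed:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-random-all-02",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # 16 * 2 = 32 total train batch size
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
    adam_beta1=0.9,                  # library defaults, matching the listed optimizer
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumption: the results table reports per-epoch eval
    save_strategy="epoch",           # assumption
)
```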
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 26.6237       | 1.0   | 86   | 2.7490          | 0.9342 | 0.7322 |
| 5.3366        | 2.0   | 172  | 1.6394          | 0.7008 | 0.4202 |
| 3.393         | 3.0   | 258  | 1.0152          | 0.4928 | 0.2308 |
| 2.1045        | 4.0   | 344  | 0.6938          | 0.3511 | 0.1511 |
| 1.8687        | 5.0   | 430  | 0.5804          | 0.2975 | 0.1133 |
| 1.0808        | 6.0   | 516  | 0.4939          | 0.2553 | 0.0986 |
| 0.9453        | 7.0   | 602  | 0.4599          | 0.2317 | 0.0858 |
| 0.9453        | 8.0   | 688  | 0.4172          | 0.2139 | 0.0739 |
| 0.7718        | 9.0   | 774  | 0.3793          | 0.1807 | 0.0622 |
| 0.72          | 10.0  | 860  | 0.4070          | 0.1814 | 0.0636 |
| 0.4718        | 11.0  | 946  | 0.4255          | 0.1644 | 0.0570 |
| 0.5514        | 12.0  | 1032 | 0.3529          | 0.1526 | 0.0511 |
| 0.3473        | 13.0  | 1118 | 0.3458          | 0.1529 | 0.0505 |
| 0.4885        | 14.0  | 1204 | 0.3217          | 0.1444 | 0.0496 |
| 0.4885        | 15.0  | 1290 | 0.3107          | 0.1448 | 0.0480 |
| 0.3774        | 16.0  | 1376 | 0.4008          | 0.1461 | 0.0488 |
| 0.3974        | 17.0  | 1462 | 0.3857          | 0.1519 | 0.0501 |
| 0.3398        | 18.0  | 1548 | 0.3497          | 0.1309 | 0.0422 |
| 0.2741        | 19.0  | 1634 | 0.3150          | 0.1312 | 0.0426 |
| 0.2805        | 20.0  | 1720 | 0.3533          | 0.1236 | 0.0405 |
| 0.3292        | 21.0  | 1806 | 0.3227          | 0.1278 | 0.0424 |
| 0.3292        | 22.0  | 1892 | 0.2969          | 0.1295 | 0.0416 |
| 0.2255        | 23.0  | 1978 | 0.2941          | 0.1192 | 0.0385 |
| 0.2107        | 24.0  | 2064 | 0.3290          | 0.1261 | 0.0421 |
| 0.1922        | 25.0  | 2150 | 0.3492          | 0.1222 | 0.0399 |
| 0.1829        | 26.0  | 2236 | 0.3640          | 0.1173 | 0.0383 |
| 0.1911        | 27.0  | 2322 | 0.3595          | 0.1224 | 0.0396 |
| 0.1712        | 28.0  | 2408 | 0.3521          | 0.1219 | 0.0390 |
| 0.1712        | 29.0  | 2494 | 0.3313          | 0.1136 | 0.0374 |
| 0.1708        | 30.0  | 2580 | 0.3219          | 0.1207 | 0.0381 |
| 0.1389        | 31.0  | 2666 | 0.3261          | 0.1114 | 0.0361 |
| 0.1516        | 32.0  | 2752 | 0.3446          | 0.1102 | 0.0359 |
| 0.2601        | 33.0  | 2838 | 0.3505          | 0.1151 | 0.0367 |
| 0.1392        | 34.0  | 2924 | 0.3282          | 0.1131 | 0.0367 |
| 0.1286        | 35.0  | 3010 | 0.3351          | 0.1129 | 0.0359 |
| 0.1286        | 36.0  | 3096 | 0.3482          | 0.1119 | 0.0357 |
| 0.1497        | 37.0  | 3182 | 0.3762          | 0.1156 | 0.0373 |
| 0.1319        | 38.0  | 3268 | 0.3733          | 0.1149 | 0.0373 |
| 0.1294        | 39.0  | 3354 | 0.3463          | 0.1178 | 0.0372 |
| 0.1459        | 40.0  | 3440 | 0.3440          | 0.1146 | 0.0378 |
| 0.0998        | 41.0  | 3526 | 0.3467          | 0.1131 | 0.0366 |
| 0.1036        | 42.0  | 3612 | 0.3225          | 0.1127 | 0.0362 |
| 0.1036        | 43.0  | 3698 | 0.3630          | 0.1105 | 0.0351 |
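The headline metrics correspond to the epoch-23 row (validation loss 0.2941, WER 0.1192, CER 0.0385), which suggests the published weights are that checkpoint. For reference, WER and CER can be reproduced with the `evaluate` library; the reference/prediction pair below is a made-up illustration, not data from this model:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical Portuguese reference/prediction pair for illustration only.
references = ["o menino leu o livro"]
predictions = ["o menino leu o livra"]

print(wer_metric.compute(predictions=predictions, references=references))  # 0.2  (1 of 5 words wrong)
print(cer_metric.compute(predictions=predictions, references=references))  # 0.05 (1 of 20 chars wrong)
```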
### Framework versions
- Transformers 4.28.0
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.13.3