
wav2vec2-xls-r-300m-lg-CV-Fleurs-1hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. It achieves the following results on the evaluation set:

  • Loss: 2.0197
  • Wer: 0.8886
  • Cer: 0.2524
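The card does not state how Wer and Cer were computed (the Hugging Face `evaluate` package is a common choice); as a reference, a minimal self-contained sketch of the standard definitions, where WER is word-level edit distance divided by the number of reference words and CER is the same at character level:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (classic DP)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution
        prev = cur
    return prev[n]

def wer(ref, hyp):
    """Word error rate: word-level edit distance / reference word count."""
    r, h = ref.split(), hyp.split()
    return edit_distance(r, h) / len(r)

def cer(ref, hyp):
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(list(ref), list(hyp)) / len(ref)

# Illustrative pair (hypothetical example, not from the eval set):
print(wer("webale nyo", "webale nnyo"))  # 0.5 (1 of 2 words wrong)
print(cer("webale nyo", "webale nnyo"))  # 0.1 (1 of 10 chars wrong)
```

With this convention, the reported Wer of 0.8886 means roughly 89 word-level edits per 100 reference words; note WER can exceed 1.0 when the hypothesis inserts many extra words.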

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
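The hyperparameters above imply an effective batch size of 16 (train_batch_size 8 × gradient_accumulation_steps 2) and, at 33 optimizer steps per epoch, 3300 steps in total, which matches the final step in the results table. A small sketch of the linear schedule these settings produce, assuming the Trainer default of zero warmup steps (the card does not report a warmup value):

```python
steps_per_epoch = 33                 # from the results table: 33 steps per epoch
num_epochs = 100
total_steps = steps_per_epoch * num_epochs   # 3300, the last step in the table
base_lr = 3e-4                       # learning_rate from the card

effective_batch = 8 * 2              # train_batch_size * gradient_accumulation_steps

def linear_lr(step, total=total_steps, lr=base_lr, warmup=0):
    """Linear warmup then linear decay to zero, mirroring the behavior of
    transformers' linear scheduler (warmup assumed 0 here)."""
    if step < warmup:
        return lr * step / max(1, warmup)
    return lr * max(0.0, (total - step) / max(1, total - warmup))

print(linear_lr(0))      # 0.0003 at the start
print(linear_lr(1650))   # 0.00015 halfway through
print(linear_lr(3300))   # 0.0 at the final step
```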

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 6.8275        | 1.0   | 33   | 3.1073          | 1.0    | 1.0    |
| 3.088         | 2.0   | 66   | 3.0379          | 1.0    | 1.0    |
| 3.0336        | 3.0   | 99   | 2.9752          | 1.0    | 1.0    |
| 3.0781        | 4.0   | 132  | 2.9620          | 1.0    | 1.0    |
| 3.0093        | 5.0   | 165  | 2.9478          | 1.0    | 1.0    |
| 2.9958        | 6.0   | 198  | 2.9473          | 1.0    | 1.0    |
| 2.9799        | 7.0   | 231  | 2.9684          | 1.0    | 1.0    |
| 2.9428        | 8.0   | 264  | 2.9958          | 1.0    | 1.0    |
| 2.9422        | 9.0   | 297  | 2.9387          | 1.0    | 1.0    |
| 2.9239        | 10.0  | 330  | 2.9286          | 1.0    | 1.0    |
| 2.9181        | 11.0  | 363  | 2.9294          | 1.0    | 1.0    |
| 2.9197        | 12.0  | 396  | 2.9307          | 1.0    | 1.0    |
| 2.9163        | 13.0  | 429  | 2.9138          | 1.0    | 1.0    |
| 2.897         | 14.0  | 462  | 2.8939          | 1.0    | 1.0    |
| 2.8723        | 15.0  | 495  | 2.8486          | 1.0    | 1.0    |
| 2.8002        | 16.0  | 528  | 2.6845          | 1.0    | 1.0    |
| 2.6696        | 17.0  | 561  | 2.5484          | 1.0    | 1.0    |
| 2.5371        | 18.0  | 594  | 2.4687          | 1.0    | 0.9557 |
| 2.3917        | 19.0  | 627  | 2.2644          | 1.0    | 0.8387 |
| 2.1942        | 20.0  | 660  | 2.0113          | 0.9999 | 0.7568 |
| 1.9233        | 21.0  | 693  | 1.7627          | 0.9999 | 0.5527 |
| 1.713         | 22.0  | 726  | 1.6223          | 0.9999 | 0.4931 |
| 1.5714        | 23.0  | 759  | 1.5463          | 0.9974 | 0.4636 |
| 1.4371        | 24.0  | 792  | 1.5468          | 0.9925 | 0.4138 |
| 1.3572        | 25.0  | 825  | 1.4155          | 0.9930 | 0.3899 |
| 1.2358        | 26.0  | 858  | 1.4144          | 0.9944 | 0.3711 |
| 1.1006        | 27.0  | 891  | 1.3660          | 0.9654 | 0.3454 |
| 1.0414        | 28.0  | 924  | 1.3686          | 0.9702 | 0.3455 |
| 0.9343        | 29.0  | 957  | 1.3086          | 0.9647 | 0.3270 |
| 0.8491        | 30.0  | 990  | 1.2775          | 0.9558 | 0.3095 |
| 0.8           | 31.0  | 1023 | 1.3166          | 0.9393 | 0.3029 |
| 0.7535        | 32.0  | 1056 | 1.3323          | 0.9552 | 0.3161 |
| 0.6824        | 33.0  | 1089 | 1.3247          | 0.9505 | 0.3028 |
| 0.6343        | 34.0  | 1122 | 1.3871          | 0.9408 | 0.3001 |
| 0.5869        | 35.0  | 1155 | 1.3804          | 0.9309 | 0.2913 |
| 0.54          | 36.0  | 1188 | 1.4685          | 0.9371 | 0.2973 |
| 0.5179        | 37.0  | 1221 | 1.5153          | 0.9322 | 0.2919 |
| 0.5097        | 38.0  | 1254 | 1.5292          | 0.9360 | 0.2880 |
| 0.4792        | 39.0  | 1287 | 1.4690          | 0.9551 | 0.2907 |
| 0.4502        | 40.0  | 1320 | 1.5556          | 0.9522 | 0.2931 |
| 0.4525        | 41.0  | 1353 | 1.5850          | 0.9356 | 0.2906 |
| 0.4032        | 42.0  | 1386 | 1.6135          | 0.9317 | 0.2811 |
| 0.4262        | 43.0  | 1419 | 1.4706          | 0.9355 | 0.2823 |
| 0.3563        | 44.0  | 1452 | 1.5744          | 0.9309 | 0.2815 |
| 0.3382        | 45.0  | 1485 | 1.5696          | 0.9327 | 0.2779 |
| 0.3448        | 46.0  | 1518 | 1.6642          | 0.9558 | 0.2792 |
| 0.3516        | 47.0  | 1551 | 1.6486          | 0.9531 | 0.2823 |
| 0.2998        | 48.0  | 1584 | 1.6531          | 0.9316 | 0.2766 |
| 0.2734        | 49.0  | 1617 | 1.7302          | 0.9403 | 0.2802 |
| 0.3185        | 50.0  | 1650 | 1.6439          | 0.9165 | 0.2735 |
| 0.2747        | 51.0  | 1683 | 1.6629          | 0.9340 | 0.2735 |
| 0.3124        | 52.0  | 1716 | 1.7290          | 0.9232 | 0.2759 |
| 0.2756        | 53.0  | 1749 | 1.6638          | 0.9066 | 0.2692 |
| 0.2431        | 54.0  | 1782 | 1.7213          | 0.9270 | 0.2723 |
| 0.2334        | 55.0  | 1815 | 1.8311          | 0.9193 | 0.2690 |
| 0.2489        | 56.0  | 1848 | 1.8231          | 0.9137 | 0.2724 |
| 0.2378        | 57.0  | 1881 | 1.8004          | 0.9229 | 0.2676 |
| 0.2189        | 58.0  | 1914 | 1.8910          | 0.9351 | 0.2715 |
| 0.2405        | 59.0  | 1947 | 1.9310          | 0.9827 | 0.2756 |
| 0.2283        | 60.0  | 1980 | 1.8233          | 0.9058 | 0.2664 |
| 0.2466        | 61.0  | 2013 | 1.8149          | 0.9227 | 0.2700 |
| 0.2347        | 62.0  | 2046 | 1.8027          | 0.9147 | 0.2653 |
| 0.2206        | 63.0  | 2079 | 1.8181          | 0.9058 | 0.2633 |
| 0.2209        | 64.0  | 2112 | 1.8466          | 0.9025 | 0.2630 |
| 0.1965        | 65.0  | 2145 | 1.8737          | 0.9198 | 0.2659 |
| 0.2045        | 66.0  | 2178 | 1.8682          | 0.9011 | 0.2636 |
| 0.1974        | 67.0  | 2211 | 1.8941          | 0.9069 | 0.2661 |
| 0.202         | 68.0  | 2244 | 1.8658          | 0.9041 | 0.2621 |
| 0.178         | 69.0  | 2277 | 1.9012          | 0.9280 | 0.2658 |
| 0.1968        | 70.0  | 2310 | 1.8812          | 0.9114 | 0.2630 |
| 0.1837        | 71.0  | 2343 | 1.8900          | 0.9131 | 0.2636 |
| 0.1743        | 72.0  | 2376 | 1.9226          | 0.9032 | 0.2628 |
| 0.1946        | 73.0  | 2409 | 1.8792          | 0.9011 | 0.2608 |
| 0.1764        | 74.0  | 2442 | 1.9741          | 0.8991 | 0.2598 |
| 0.1786        | 75.0  | 2475 | 1.9462          | 0.9031 | 0.2613 |
| 0.1712        | 76.0  | 2508 | 1.9271          | 0.8981 | 0.2587 |
| 0.1519        | 77.0  | 2541 | 1.9549          | 0.8978 | 0.2603 |
| 0.1665        | 78.0  | 2574 | 1.8921          | 0.9081 | 0.2568 |
| 0.1804        | 79.0  | 2607 | 1.9477          | 0.8968 | 0.2563 |
| 0.1857        | 80.0  | 2640 | 1.9150          | 0.9036 | 0.2550 |
| 0.1726        | 81.0  | 2673 | 1.9076          | 0.9022 | 0.2553 |
| 0.1603        | 82.0  | 2706 | 1.9458          | 0.8974 | 0.2562 |
| 0.162         | 83.0  | 2739 | 1.9707          | 0.8896 | 0.2555 |
| 0.1622        | 84.0  | 2772 | 1.9842          | 0.8917 | 0.2561 |
| 0.1403        | 85.0  | 2805 | 1.9844          | 0.8938 | 0.2555 |
| 0.1671        | 86.0  | 2838 | 1.9455          | 0.8892 | 0.2546 |
| 0.1518        | 87.0  | 2871 | 1.9507          | 0.8937 | 0.2551 |
| 0.1552        | 88.0  | 2904 | 1.9507          | 0.8902 | 0.2539 |
| 0.1569        | 89.0  | 2937 | 1.9410          | 0.8903 | 0.2530 |
| 0.1579        | 90.0  | 2970 | 1.9786          | 0.8840 | 0.2522 |
| 0.1451        | 91.0  | 3003 | 2.0066          | 0.8916 | 0.2540 |
| 0.1482        | 92.0  | 3036 | 2.0054          | 0.8882 | 0.2536 |
| 0.1498        | 93.0  | 3069 | 2.0093          | 0.8900 | 0.2531 |
| 0.1444        | 94.0  | 3102 | 2.0054          | 0.8895 | 0.2531 |
| 0.139         | 95.0  | 3135 | 1.9994          | 0.8910 | 0.2533 |
| 0.1571        | 96.0  | 3168 | 2.0042          | 0.8887 | 0.2524 |
| 0.1478        | 97.0  | 3201 | 2.0088          | 0.8884 | 0.2526 |
| 0.1537        | 98.0  | 3234 | 2.0127          | 0.8860 | 0.2523 |
| 0.1379        | 99.0  | 3267 | 2.0143          | 0.8880 | 0.2527 |
| 0.1401        | 100.0 | 3300 | 2.0197          | 0.8886 | 0.2524 |
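The Wer and Cer columns above come from CTC decoding of the model's per-frame logits: wav2vec2 fine-tuned with a CTC head emits one token id per frame, and greedy decoding collapses consecutive repeats and removes blank tokens before comparing against the reference. A minimal sketch of that decoding rule (the toy vocabulary and blank id below are hypothetical; the real checkpoint's tokenizer defines its own):

```python
def ctc_greedy_decode(ids, blank_id=0):
    """Standard greedy CTC decoding: collapse consecutive repeats,
    then drop blank tokens."""
    out = []
    prev = None
    for i in ids:
        if i != prev and i != blank_id:
            out.append(i)
        prev = i
    return out

# Hypothetical toy vocabulary; 0 is the CTC blank.
vocab = {1: "a", 2: "b"}
text = "".join(vocab[i] for i in ctc_greedy_decode([1, 1, 0, 1, 2, 2]))
print(text)  # "aab": repeated 1s collapse, the blank separates the two a's
```

This is why a blank token between two identical frame predictions matters: without it, "aa" would collapse to a single "a".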

Framework versions

  • Transformers 4.48.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size

315M parameters, stored as F32 safetensors.