# xls-r-asr_xh-run6
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the asr_xh dataset. It achieves the following word error rates (WER) on the validation and test sets (a usage sketch follows the list):
- Wer (Validation): 53.06%
- Wer (Test): 53.88%
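
A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub (the repo id below is a placeholder) and that the input audio is mono 16 kHz, as expected by XLS-R:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id; replace with the actual Hub path of this checkpoint.
MODEL_ID = "your-username/xls-r-asr_xh-run6"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load a waveform and resample to 16 kHz if needed.
waveform, sample_rate = torchaudio.load("example.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(
    waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt"
)

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```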
## Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
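
For reference, a hedged sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` and the evaluation/save cadence are assumptions (the 100-step cadence is inferred from the results table below), and the Adam betas/epsilon listed above are the library defaults, so they need no extra arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xls-r-asr_xh-run6",   # assumed output directory
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # effective train batch size of 16
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                        # native AMP mixed precision
    evaluation_strategy="steps",      # assumption: matches the 100-step eval cadence
    eval_steps=100,
    logging_steps=100,
    save_steps=100,
)
```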
## Training results
| Training Loss | Epoch | Step | Validation Loss | Wer (Validation) |
|---|---|---|---|---|
| 10.217 | 0.64 | 100 | 4.3372 | 1.0 |
| 3.5916 | 1.27 | 200 | 3.0758 | 1.0 |
| 3.04 | 1.91 | 300 | 2.8524 | 1.0 |
| 1.8074 | 2.55 | 400 | 0.8240 | 0.8711 |
| 0.7939 | 3.18 | 500 | 0.6766 | 0.8073 |
| 0.6324 | 3.82 | 600 | 0.5092 | 0.6501 |
| 0.4725 | 4.46 | 700 | 0.5041 | 0.6139 |
| 0.4319 | 5.1 | 800 | 0.4249 | 0.5892 |
| 0.3264 | 5.73 | 900 | 0.4713 | 0.5905 |
| 0.3122 | 6.37 | 1000 | 0.4220 | 0.5513 |
| 0.2631 | 7.01 | 1100 | 0.4722 | 0.5659 |
| 0.2203 | 7.64 | 1200 | 0.4176 | 0.5306 |
| 0.2113 | 8.28 | 1300 | 0.4374 | 0.5468 |
| 0.1885 | 8.92 | 1400 | 0.4484 | 0.5021 |
## Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3