---
language:
- kk
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- ISSAI_KSC2
metrics:
- wer
model-index:
- name: Kammi
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: BilalS96/ISSAI_KSC2
      type: ISSAI_KSC2
      args: 'config: kzk, split: test'
    metrics:
    - name: Wer
      type: wer
      value: 0.38223702730599607
---
# Kammi
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the [BilalS96/ISSAI_KSC2](https://huggingface.co/datasets/BilalS96/ISSAI_KSC2) dataset. It achieves the following results on the evaluation set:
- Loss: 0.6797
- Wer: 0.3822
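The autogenerated card includes no usage snippet, so here is a minimal transcription sketch with 🤗 Transformers. The repo id `your-username/Kammi` and the audio path are placeholders; the card does not state where the checkpoint is published.

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id: the card does not give the published model path.
model_id = "your-username/Kammi"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a Kazakh speech clip, resampled to the 16 kHz rate XLSR-53 expects.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```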
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
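The model-index metadata above points at the `kzk` config and `test` split of BilalS96/ISSAI_KSC2. A minimal loading sketch with 🤗 Datasets, assuming the hub repo exposes that config name:

```python
from datasets import load_dataset

# Config "kzk" and the test split come from the model-index metadata above;
# that the hub repo exposes them under these names is an assumption.
ksc2_test = load_dataset("BilalS96/ISSAI_KSC2", "kzk", split="test")
print(ksc2_test)
```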
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
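For reference, a sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the output directory is a placeholder and the model/data wiring is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="kammi",              # placeholder; not stated in the card
    learning_rate=3e-4,
    per_device_train_batch_size=4,   # with 2 accumulation steps -> effective 8
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```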
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:---:|:---:|:---:|:---:|:---:|
| 5.2923 | 0.4278 | 400 | 4.4181 | 1.0 |
| 3.3189 | 0.8556 | 800 | 3.6105 | 1.0 |
| 3.2115 | 1.2834 | 1200 | 3.3715 | 1.0 |
| 3.148 | 1.7112 | 1600 | 3.1163 | 1.0 |
| 3.0788 | 2.1390 | 2000 | 3.2185 | 1.0 |
| 2.9677 | 2.5668 | 2400 | 2.7724 | 1.0000 |
| 2.3283 | 2.9947 | 2800 | 1.7294 | 0.9985 |
| 1.6653 | 3.4225 | 3200 | 1.3565 | 0.9627 |
| 1.4308 | 3.8503 | 3600 | 1.1434 | 0.9235 |
| 1.2196 | 4.2781 | 4000 | 0.9823 | 0.8583 |
| 1.0644 | 4.7059 | 4400 | 0.8573 | 0.8191 |
| 0.9649 | 5.1337 | 4800 | 0.8064 | 0.7725 |
| 0.849 | 5.5615 | 5200 | 0.7391 | 0.7389 |
| 0.8208 | 5.9893 | 5600 | 0.7014 | 0.6868 |
| 0.6995 | 6.4171 | 6000 | 0.6765 | 0.6687 |
| 0.703 | 6.8449 | 6400 | 0.6347 | 0.6476 |
| 0.6136 | 7.2727 | 6800 | 0.6371 | 0.6226 |
| 0.5957 | 7.7005 | 7200 | 0.6068 | 0.6000 |
| 0.5616 | 8.1283 | 7600 | 0.5877 | 0.5774 |
| 0.5128 | 8.5561 | 8000 | 0.5878 | 0.5605 |
| 0.5093 | 8.9840 | 8400 | 0.5502 | 0.5469 |
| 0.4544 | 9.4118 | 8800 | 0.5823 | 0.5424 |
| 0.4622 | 9.8396 | 9200 | 0.5546 | 0.5219 |
| 0.424 | 10.2674 | 9600 | 0.5910 | 0.5247 |
| 0.4041 | 10.6952 | 10000 | 0.5735 | 0.5130 |
| 0.3956 | 11.1230 | 10400 | 0.5673 | 0.5005 |
| 0.3694 | 11.5508 | 10800 | 0.5336 | 0.4940 |
| 0.3675 | 11.9786 | 11200 | 0.5304 | 0.4886 |
| 0.338 | 12.4064 | 11600 | 0.6132 | 0.4859 |
| 0.3355 | 12.8342 | 12000 | 0.6146 | 0.4872 |
| 0.3251 | 13.2620 | 12400 | 0.5979 | 0.4753 |
| 0.309 | 13.6898 | 12800 | 0.5721 | 0.4657 |
| 0.3065 | 14.1176 | 13200 | 0.5849 | 0.4598 |
| 0.2824 | 14.5455 | 13600 | 0.5872 | 0.4644 |
| 0.2875 | 14.9733 | 14000 | 0.5864 | 0.4540 |
| 0.2663 | 15.4011 | 14400 | 0.5885 | 0.4513 |
| 0.2711 | 15.8289 | 14800 | 0.6090 | 0.4553 |
| 0.2566 | 16.2567 | 15200 | 0.6312 | 0.4532 |
| 0.2524 | 16.6845 | 15600 | 0.6248 | 0.4450 |
| 0.2528 | 17.1123 | 16000 | 0.6329 | 0.4390 |
| 0.2381 | 17.5401 | 16400 | 0.6040 | 0.4370 |
| 0.2336 | 17.9679 | 16800 | 0.5855 | 0.4327 |
| 0.2184 | 18.3957 | 17200 | 0.6107 | 0.4327 |
| 0.2253 | 18.8235 | 17600 | 0.6087 | 0.4316 |
| 0.2169 | 19.2513 | 18000 | 0.6169 | 0.4261 |
| 0.2142 | 19.6791 | 18400 | 0.6025 | 0.4321 |
| 0.2125 | 20.1070 | 18800 | 0.6478 | 0.4261 |
| 0.1994 | 20.5348 | 19200 | 0.6504 | 0.4238 |
| 0.2025 | 20.9626 | 19600 | 0.6580 | 0.4229 |
| 0.1954 | 21.3904 | 20000 | 0.6401 | 0.4170 |
| 0.1939 | 21.8182 | 20400 | 0.6443 | 0.4119 |
| 0.1865 | 22.2460 | 20800 | 0.6588 | 0.4140 |
| 0.1847 | 22.6738 | 21200 | 0.6463 | 0.4087 |
| 0.185 | 23.1016 | 21600 | 0.6490 | 0.4058 |
| 0.1796 | 23.5294 | 22000 | 0.6653 | 0.4070 |
| 0.1745 | 23.9572 | 22400 | 0.6452 | 0.4042 |
| 0.173 | 24.3850 | 22800 | 0.6895 | 0.4018 |
| 0.1653 | 24.8128 | 23200 | 0.6482 | 0.4017 |
| 0.165 | 25.2406 | 23600 | 0.6620 | 0.3962 |
| 0.1622 | 25.6684 | 24000 | 0.6702 | 0.3971 |
| 0.1565 | 26.0963 | 24400 | 0.6899 | 0.3985 |
| 0.1563 | 26.5241 | 24800 | 0.7042 | 0.3932 |
| 0.1555 | 26.9519 | 25200 | 0.7017 | 0.3931 |
| 0.1548 | 27.3797 | 25600 | 0.6751 | 0.3895 |
| 0.1543 | 27.8075 | 26000 | 0.6831 | 0.3895 |
| 0.1464 | 28.2353 | 26400 | 0.6765 | 0.3842 |
| 0.1475 | 28.6631 | 26800 | 0.6842 | 0.3858 |
| 0.144 | 29.0909 | 27200 | 0.6904 | 0.3851 |
| 0.1461 | 29.5187 | 27600 | 0.6821 | 0.3834 |
| 0.1417 | 29.9465 | 28000 | 0.6797 | 0.3822 |
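The Wer column is the word error rate, i.e. (substitutions + insertions + deletions) divided by the number of reference words. A minimal sketch of computing it with the `evaluate` library; the example strings are illustrative, not taken from the test set:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative Kazakh strings only; real scores come from model transcriptions
# scored against the ISSAI_KSC2 test references.
predictions = ["қайырлы таң достар", "бұл мысал"]
references = ["қайырлы таң достар", "бұл бір мысал"]
print(wer_metric.compute(predictions=predictions, references=references))
```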
### Framework versions
- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1