---
license: mit
base_model: facebook/w2v-bert-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: w2v-bert-2.0-malayalam_mixeddataset_thre
  results: []
---

# w2v-bert-2.0-malayalam_mixeddataset_thre

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1604
- Wer: 0.1244

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.1974        | 0.47  | 600   | 0.3732          | 0.4971 |
| 0.1677        | 0.95  | 1200  | 0.2552          | 0.3411 |
| 0.1229        | 1.42  | 1800  | 0.2184          | 0.3123 |
| 0.1041        | 1.9   | 2400  | 0.2044          | 0.2921 |
| 0.0825        | 2.37  | 3000  | 0.2150          | 0.2667 |
| 0.0756        | 2.85  | 3600  | 0.1882          | 0.2361 |
| 0.0627        | 3.32  | 4200  | 0.1735          | 0.2493 |
| 0.0557        | 3.8   | 4800  | 0.1653          | 0.2117 |
| 0.0454        | 4.27  | 5400  | 0.1669          | 0.1891 |
| 0.0394        | 4.74  | 6000  | 0.1610          | 0.1903 |
| 0.0363        | 5.22  | 6600  | 0.1654          | 0.1699 |
| 0.0278        | 5.69  | 7200  | 0.1465          | 0.1640 |
| 0.025         | 6.17  | 7800  | 0.1503          | 0.1617 |
| 0.0198        | 6.64  | 8400  | 0.1429          | 0.1466 |
| 0.0174        | 7.12  | 9000  | 0.1440          | 0.1453 |
| 0.013         | 7.59  | 9600  | 0.1496          | 0.1433 |
| 0.0125        | 8.07  | 10200 | 0.1465          | 0.1274 |
| 0.0076        | 8.54  | 10800 | 0.1479          | 0.1349 |
| 0.0076        | 9.02  | 11400 | 0.1521          | 0.1229 |
| 0.0041        | 9.49  | 12000 | 0.1600          | 0.1291 |
| 0.0038        | 9.96  | 12600 | 0.1604          | 0.1244 |

### Framework versions

- Transformers 4.39.3
- Pytorch 2.1.1+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
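
## Example usage (illustrative sketch)

The card does not document how to run inference. The sketch below is a minimal, hedged example of loading a w2v-bert-2.0 CTC checkpoint with Transformers 4.39; the repo id and audio file path are assumptions inferred from the model name, not values documented in this card.

```python
# Hedged sketch: CTC inference with a fine-tuned w2v-bert-2.0 checkpoint.
# The repo id and audio path below are assumptions; substitute the real ones.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "w2v-bert-2.0-malayalam_mixeddataset_thre"  # assumed Hub repo id

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# Load a mono audio clip resampled to 16 kHz (placeholder file name).
speech, _ = librosa.load("example_malayalam.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding back to text.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```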
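
## Computing WER (illustrative sketch)

The reported metric is word error rate (Wer: 0.1244 on the evaluation set). As a minimal sketch of how WER is commonly computed in `generated_from_trainer` setups, using the `evaluate` library; the reference and prediction strings are placeholders, not data from this model's evaluation set.

```python
# Hedged sketch: word error rate with the `evaluate` library.
# The strings below are placeholders for real reference/predicted transcripts.
import evaluate

wer_metric = evaluate.load("wer")

references = ["this is a reference transcript"]
predictions = ["this is a predicted transcript"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # lower is better; this card reports 0.1244 on its eval set
```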
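
## Reproducing the training arguments (illustrative sketch)

For readers who want to mirror the hyperparameters listed under "Training procedure", the sketch below maps them onto `TrainingArguments`. Only the values listed in this card are grounded; the output directory, optimizer name, and evaluation/save cadence are assumptions (the cadence is inferred from the 600-step intervals in the results table), and dataset preparation, the data collator, and the `Wav2Vec2BertForCTC` model setup are omitted.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
# Anything not in the card's hyperparameter list is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-malayalam_mixeddataset_thre",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",             # Trainer default; betas=(0.9, 0.999), eps=1e-8 as listed
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",     # assumed from the 600-step eval intervals in the table
    eval_steps=600,
    save_steps=600,
)
```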