
Millad_Customer_RN

This model is a fine-tuned version of facebook/wav2vec2-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 4.5635
  • WER (word error rate): 0.8113
  • CER (character error rate): 0.4817
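
The WER/CER pair indicates a CTC-style speech-recognition model. A minimal transcription sketch using the standard Transformers wav2vec2 API, assuming the checkpoint is published under this name on the Hub (the repo id below is a placeholder) and that the input audio is 16 kHz mono:

```python
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
MODEL_ID = "Millad_Customer_RN"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

def transcribe(speech):
    # `speech` is a 1-D float array sampled at 16 kHz,
    # the rate wav2vec2-base was pretrained on.
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```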

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 4000
  • num_epochs: 600
  • mixed_precision_training: Native AMP
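
For reference, a minimal sketch of how these settings map onto the Transformers TrainingArguments API; the output directory is a placeholder, and the model, dataset, and data-collator setup are omitted:

```python
from transformers import TrainingArguments

# TrainingArguments mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wav2vec2-base-millad-customer-rn",  # hypothetical name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=4000,
    num_train_epochs=600,
    fp16=True,  # Native AMP mixed-precision training
)
```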

Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER    | CER    |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
| 1.9257        | 13.33  | 2000  | 2.0606          | 0.9767 | 0.5500 |
| 1.4828        | 26.67  | 4000  | 2.1161          | 0.9019 | 0.4932 |
| 1.2582        | 40.0   | 6000  | 2.0589          | 0.8504 | 0.4942 |
| 0.9804        | 53.33  | 8000  | 2.4633          | 0.8745 | 0.4763 |
| 0.7862        | 66.67  | 10000 | 2.4794          | 0.8861 | 0.4944 |
| 0.6492        | 80.0   | 12000 | 2.8693          | 0.8554 | 0.4928 |
| 0.5375        | 93.33  | 14000 | 2.6125          | 0.8296 | 0.4802 |
| 0.4462        | 106.67 | 16000 | 2.7591          | 0.8770 | 0.4974 |
| 0.3873        | 120.0  | 18000 | 3.0325          | 0.8379 | 0.4800 |
| 0.3445        | 133.33 | 20000 | 2.9965          | 0.8761 | 0.4986 |
| 0.3087        | 146.67 | 22000 | 3.3437          | 0.8221 | 0.4923 |
| 0.2755        | 160.0  | 24000 | 3.3022          | 0.8803 | 0.5211 |
| 0.2467        | 173.33 | 26000 | 3.2348          | 0.8479 | 0.4933 |
| 0.2281        | 186.67 | 28000 | 3.8010          | 0.8695 | 0.5081 |
| 0.2119        | 200.0  | 30000 | 3.0446          | 0.8545 | 0.4902 |
| 0.194         | 213.33 | 32000 | 3.0873          | 0.8454 | 0.4840 |
| 0.1677        | 226.67 | 34000 | 3.6184          | 0.8645 | 0.5019 |
| 0.1642        | 240.0  | 36000 | 3.2480          | 0.8412 | 0.4903 |
| 0.1656        | 253.33 | 38000 | 3.4379          | 0.8362 | 0.4816 |
| 0.1371        | 266.67 | 40000 | 3.5117          | 0.8479 | 0.5040 |
| 0.1301        | 280.0  | 42000 | 3.4360          | 0.8404 | 0.4870 |
| 0.128         | 293.33 | 44000 | 3.6589          | 0.8537 | 0.4977 |
| 0.1152        | 306.67 | 46000 | 4.2359          | 0.8545 | 0.5051 |
| 0.1119        | 320.0  | 48000 | 3.5818          | 0.7980 | 0.4882 |
| 0.1026        | 333.33 | 50000 | 3.7618          | 0.8013 | 0.4865 |
| 0.0945        | 346.67 | 52000 | 4.2197          | 0.8404 | 0.5028 |
| 0.0962        | 360.0  | 54000 | 3.9231          | 0.8653 | 0.5030 |
| 0.088         | 373.33 | 56000 | 3.8400          | 0.8354 | 0.4914 |
| 0.0743        | 386.67 | 58000 | 3.4924          | 0.8088 | 0.4824 |
| 0.0811        | 400.0  | 60000 | 3.8370          | 0.8396 | 0.4861 |
| 0.0696        | 413.33 | 62000 | 4.2808          | 0.8412 | 0.5065 |
| 0.0692        | 426.67 | 64000 | 4.0161          | 0.8088 | 0.4744 |
| 0.0622        | 440.0  | 66000 | 3.9080          | 0.8163 | 0.4910 |
| 0.0591        | 453.33 | 68000 | 3.9838          | 0.8113 | 0.4823 |
| 0.0527        | 466.67 | 70000 | 3.8067          | 0.8329 | 0.4914 |
| 0.056         | 480.0  | 72000 | 4.1415          | 0.8096 | 0.4782 |
| 0.0535        | 493.33 | 74000 | 4.3350          | 0.8229 | 0.4828 |
| 0.0531        | 506.67 | 76000 | 3.9808          | 0.8071 | 0.4807 |
| 0.0451        | 520.0  | 78000 | 4.0301          | 0.7988 | 0.4816 |
| 0.044         | 533.33 | 80000 | 4.4680          | 0.8371 | 0.4921 |
| 0.0389        | 546.67 | 82000 | 4.1380          | 0.8121 | 0.4819 |
| 0.0392        | 560.0  | 84000 | 4.3910          | 0.7930 | 0.4763 |
| 0.0389        | 573.33 | 86000 | 4.5086          | 0.8055 | 0.4802 |
| 0.0355        | 586.67 | 88000 | 4.6259          | 0.8113 | 0.4821 |
| 0.0307        | 600.0  | 90000 | 4.5635          | 0.8113 | 0.4817 |
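
The WER and CER columns are word and character error rates. As an illustration of how such numbers are computed (a minimal sketch with the jiwer library, which is not confirmed to be what the training script used):

```python
import jiwer

reference = "the quick brown fox"
hypothesis = "the quick brown box"

wer = jiwer.wer(reference, hypothesis)  # fraction of word edits needed
cer = jiwer.cer(reference, hypothesis)  # fraction of character edits needed
print(f"WER={wer:.4f}  CER={cer:.4f}")
```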

Framework versions

  • Transformers 4.17.0
  • PyTorch 1.12.0+cu113
  • Datasets 1.18.3
  • Tokenizers 0.12.1