---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: seq-xls-r-fleurs_nl-run2-asr_af-run2
    results: []
datasets:
  - lucas-meyer/asr_af
---

# seq-xls-r-fleurs_nl-run2-asr_af-run2

This model is a fine-tuned version of lucas-meyer/xls-r-fleurs_nl-run2 on the asr_af dataset. It achieves the following word error rates (WER):

- Wer (Validation): 38.75%
- Wer (Test): 38.66%
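
A minimal inference sketch using the `transformers` ASR pipeline. The repository id is assumed from the model name in this card, and the audio file path is a placeholder (16 kHz mono audio is expected):

```python
from transformers import pipeline

# Load the fine-tuned XLS-R checkpoint for Afrikaans ASR.
# The repository id below is assumed from the model name in this card.
asr = pipeline(
    "automatic-speech-recognition",
    model="lucas-meyer/seq-xls-r-fleurs_nl-run2-asr_af-run2",
)

# Transcribe a local audio file (placeholder path).
result = asr("sample.wav")
print(result["text"])
```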

## Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 12
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
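
As a rough sketch, these settings correspond to a Hugging Face `TrainingArguments` configuration along the following lines. The output directory is a placeholder and `fp16=True` is assumed to stand in for Native AMP; everything else mirrors the list above (the Adam betas and epsilon are the library defaults):

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the hyperparameters above.
training_args = TrainingArguments(
    output_dir="seq-xls-r-fleurs_nl-run2-asr_af-run2",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=3,  # effective train batch size: 4 * 3 = 12
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # assumed equivalent of Native AMP mixed precision
)
```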

## Training results

| Training Loss | Epoch | Step | Validation Loss | Wer (Train) |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|
| 7.6065        | 0.44  | 100  | 3.3086          | 1.0         |
| 3.055         | 0.88  | 200  | 2.9676          | 0.9998      |
| 2.7713        | 1.32  | 300  | 1.9810          | 0.9998      |
| 1.3251        | 1.76  | 400  | 0.8096          | 0.6136      |
| 0.7431        | 2.2   | 500  | 0.6821          | 0.5622      |
| 0.5789        | 2.64  | 600  | 0.5596          | 0.5133      |
| 0.4866        | 3.08  | 700  | 0.4707          | 0.4381      |
| 0.3558        | 3.52  | 800  | 0.4653          | 0.4353      |
| 0.3362        | 3.96  | 900  | 0.4878          | 0.4235      |
| 0.2631        | 4.41  | 1000 | 0.4621          | 0.3907      |
| 0.2667        | 4.85  | 1100 | 0.4746          | 0.3841      |
| 0.2464        | 5.29  | 1200 | 0.4383          | 0.3780      |
| 0.205         | 5.73  | 1300 | 0.4207          | 0.3877      |
| 0.1939        | 6.17  | 1400 | 0.4490          | 0.3746      |
| 0.1644        | 6.61  | 1500 | 0.4325          | 0.3549      |
| 0.1782        | 7.05  | 1600 | 0.4699          | 0.3791      |

## Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
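
A small sketch for checking that a local environment matches these pinned versions (package names only; nothing beyond the list above is assumed):

```python
# Quick sanity check against the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.28.0
print("PyTorch:", torch.__version__)              # expected 2.0.1+cu117
print("Datasets:", datasets.__version__)          # expected 2.14.4
print("Tokenizers:", tokenizers.__version__)      # expected 0.13.3
```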