---
license: mit
base_model: facebook/w2v-bert-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: w2v-bert-2.0-nonstudio_and_studioRecords
  results: []
---
# w2v-bert-2.0-nonstudio_and_studioRecords
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1771
- Wer: 0.1179
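
Since the card does not include a usage snippet, here is a minimal inference sketch using the standard `transformers` ASR pipeline. The repository id below is hypothetical; replace it with the actual Hub path of this checkpoint.

```python
from transformers import pipeline

# Hypothetical repository id; substitute the actual Hub path of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="w2v-bert-2.0-nonstudio_and_studioRecords",
)

# The pipeline resamples input audio to the model's expected 16 kHz rate.
result = asr("sample.wav")
print(result["text"])
```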
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (reproduced in the code sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
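
The list above maps directly onto `transformers.TrainingArguments`. A minimal sketch, assuming default Adam betas and epsilon (which match the values listed) and a hypothetical output directory; model, dataset, and data-collator setup are omitted:

```python
from transformers import TrainingArguments

# Reproduces the hyperparameters listed above; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-nonstudio_and_studioRecords",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                       # native AMP mixed-precision training
)
```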
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.1594        | 0.46  | 600   | 0.3721          | 0.4705 |
| 0.1751        | 0.92  | 1200  | 0.2652          | 0.3615 |
| 0.1269        | 1.38  | 1800  | 0.2069          | 0.2824 |
| 0.1113        | 1.84  | 2400  | 0.1867          | 0.2535 |
| 0.0904        | 2.3   | 3000  | 0.1907          | 0.2555 |
| 0.0783        | 2.76  | 3600  | 0.1740          | 0.2421 |
| 0.0691        | 3.22  | 4200  | 0.1860          | 0.2366 |
| 0.0588        | 3.68  | 4800  | 0.1696          | 0.2195 |
| 0.0541        | 4.14  | 5400  | 0.1560          | 0.1859 |
| 0.0421        | 4.6   | 6000  | 0.1812          | 0.1757 |
| 0.0385        | 5.06  | 6600  | 0.1643          | 0.1677 |
| 0.0305        | 5.52  | 7200  | 0.1457          | 0.1553 |
| 0.0309        | 5.98  | 7800  | 0.1494          | 0.1558 |
| 0.0214        | 6.44  | 8400  | 0.1516          | 0.1428 |
| 0.0216        | 6.9   | 9000  | 0.1409          | 0.1408 |
| 0.0146        | 7.36  | 9600  | 0.1524          | 0.1359 |
| 0.0133        | 7.82  | 10200 | 0.1494          | 0.1294 |
| 0.0103        | 8.28  | 10800 | 0.1600          | 0.1321 |
| 0.0079        | 8.74  | 11400 | 0.1658          | 0.1224 |
| 0.0065        | 9.2   | 12000 | 0.1644          | 0.1227 |
| 0.0043        | 9.66  | 12600 | 0.1771          | 0.1179 |
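
The Wer column above is the word error rate on the validation set. For reference, such a score can be computed with the `evaluate` library; a minimal sketch with placeholder strings:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder strings; in practice these come from the eval set and model output.
references = ["the quick brown fox"]
predictions = ["the quick brown box"]

# One substitution over four reference words -> WER = 0.25
print(wer_metric.compute(references=references, predictions=predictions))
```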
### Framework versions
- Transformers 4.39.3
- Pytorch 2.1.1+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1