
w2v-bert-2.0-15red

This model is a fine-tuned version of facebook/w2v-bert-2.0 for automatic speech recognition; the training dataset is not specified in this card. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

  • Loss: 0.2430
  • WER: 0.1037
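As a quick orientation, here is a minimal inference sketch (not part of the original card). The checkpoint id comes from this repository; the audio loading is assumed, with `audio` standing in for any 16 kHz mono float array, and it is assumed the repo ships the processor files, as generated fine-tunes usually do.

```python
import torch
from transformers import AutoProcessor, Wav2Vec2BertForCTC

repo_id = "meiiny00/w2v-bert-2.0-15red"
processor = AutoProcessor.from_pretrained(repo_id)
model = Wav2Vec2BertForCTC.from_pretrained(repo_id)

# `audio` is assumed: a 1-D float array of 16 kHz mono speech.
inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, frames, vocab)

# Simplest CTC decode: greedy argmax over frames, then collapse with the tokenizer.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```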

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 17000
  • mixed_precision_training: Native AMP
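As a rough guide, these values map onto transformers.TrainingArguments as sketched below. The output_dir and the fp16 flag are assumptions (the card only says "Native AMP"), and the Adam betas/epsilon listed above are the TrainingArguments defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-15red",  # assumed; matches the model name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=17_000,               # "training_steps" above
    fp16=True,                      # "Native AMP" mixed precision
)
```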

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 0.6691        | 0.4261  | 500   | 0.6233          | 0.5383 |
| 0.45          | 0.8522  | 1000  | 0.4130          | 0.3675 |
| 0.3758        | 1.2782  | 1500  | 0.3631          | 0.3105 |
| 0.2521        | 1.7043  | 2000  | 0.3245          | 0.2777 |
| 0.2897        | 2.1304  | 2500  | 0.3024          | 0.2466 |
| 0.23          | 2.5565  | 3000  | 0.2813          | 0.2389 |
| 0.2937        | 2.9825  | 3500  | 0.2713          | 0.2236 |
| 0.1833        | 3.4086  | 4000  | 0.2600          | 0.2055 |
| 0.1375        | 3.8347  | 4500  | 0.2424          | 0.1910 |
| 0.2097        | 4.2608  | 5000  | 0.2376          | 0.1856 |
| 0.1676        | 4.6868  | 5500  | 0.2304          | 0.1839 |
| 0.1268        | 5.1129  | 6000  | 0.2328          | 0.1687 |
| 0.1229        | 5.5390  | 6500  | 0.2274          | 0.1646 |
| 0.1116        | 5.9651  | 7000  | 0.2103          | 0.1562 |
| 0.2322        | 6.3911  | 7500  | 0.2080          | 0.1540 |
| 0.1592        | 6.8172  | 8000  | 0.2151          | 0.1496 |
| 0.0796        | 7.2433  | 8500  | 0.2065          | 0.1401 |
| 0.0774        | 7.6694  | 9000  | 0.2036          | 0.1373 |
| 0.0979        | 8.0954  | 9500  | 0.2109          | 0.1361 |
| 0.0916        | 8.5215  | 10000 | 0.2082          | 0.1320 |
| 0.1057        | 8.9476  | 10500 | 0.2080          | 0.1294 |
| 0.0642        | 9.3737  | 11000 | 0.2032          | 0.1245 |
| 0.0585        | 9.7997  | 11500 | 0.1974          | 0.1232 |
| 0.0531        | 10.2258 | 12000 | 0.2108          | 0.1203 |
| 0.049         | 10.6519 | 12500 | 0.2027          | 0.1155 |
| 0.0431        | 11.0780 | 13000 | 0.2065          | 0.1152 |
| 0.0454        | 11.5040 | 13500 | 0.2167          | 0.1122 |
| 0.0236        | 11.9301 | 14000 | 0.2195          | 0.1113 |
| 0.0313        | 12.3562 | 14500 | 0.2314          | 0.1080 |
| 0.0452        | 12.7823 | 15000 | 0.2231          | 0.1063 |
| 0.0159        | 13.2084 | 15500 | 0.2259          | 0.1057 |
| 0.0166        | 13.6344 | 16000 | 0.2355          | 0.1043 |
| 0.0175        | 14.0605 | 16500 | 0.2340          | 0.1040 |
| 0.0169        | 14.4866 | 17000 | 0.2430          | 0.1037 |
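The card does not state how the WER column was computed. A common choice, sketched here purely as an assumption, is the `evaluate` library's wer metric, which scores word-level edit distance against the reference:

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the cat sat on the mat"]  # hypothetical decoded outputs
references = ["the cat sat on a mat"]     # hypothetical ground truth
print(wer_metric.compute(predictions=predictions, references=references))
# -> 0.1667 (1 substitution over 6 reference words)
```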

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu124
  • Datasets 2.20.0
  • Tokenizers 0.19.1