uaspeech-mms1ball-Nov30

This model is a fine-tuned version of facebook/mms-1b-all on an unspecified dataset (the repo name suggests the UASpeech dysarthric speech corpus, but the card does not confirm this). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 1.1373
  • WER: 0.5222
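
Since MMS checkpoints are Wav2Vec2 CTC models, the fine-tuned weights load with the standard Transformers classes. A minimal inference sketch, assuming the waveform is already in memory (MMS models expect 16 kHz mono audio):

```python
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "sqrk/uaspeech-mms1ball-Nov30"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# `waveform` is assumed to be a 1-D float array sampled at 16 kHz,
# e.g. loaded with librosa.load(path, sr=16_000).
inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)[0]
print(processor.decode(pred_ids))
```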

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 0.001
  • train_batch_size: 3
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
  • mixed_precision_training: Native AMP
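
A hedged reconstruction of these settings as Transformers TrainingArguments; the output_dir name is a placeholder, and the 100-step evaluation cadence is read off the log below rather than stated in the list:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="uaspeech-mms1ball-Nov30",  # placeholder, not from the original run
    learning_rate=1e-3,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
    fp16=True,               # "Native AMP" mixed-precision training
    eval_strategy="steps",   # the log below evaluates every 100 steps
    eval_steps=100,
    logging_steps=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```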

Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|---------------|-------|------|-----------------|-----|
| 45.2252 | 0.0063 | 100 | 3.7562 | 1.0 |
| 2.9851 | 0.0127 | 200 | 2.0328 | 0.8281 |
| 2.0067 | 0.0190 | 300 | 1.8614 | 0.6848 |
| 2.1249 | 0.0254 | 400 | 2.0929 | 0.7247 |
| 1.8786 | 0.0317 | 500 | 1.6966 | 0.6522 |
| 1.8295 | 0.0381 | 600 | 1.6550 | 0.6579 |
| 1.8186 | 0.0444 | 700 | 1.6961 | 0.6457 |
| 1.839 | 0.0508 | 800 | 1.7019 | 0.6642 |
| 1.8854 | 0.0571 | 900 | 1.6492 | 0.6455 |
| 2.0186 | 0.0635 | 1000 | 1.5746 | 0.6682 |
| 1.8061 | 0.0698 | 1100 | 1.5590 | 0.6345 |
| 1.6994 | 0.0761 | 1200 | 1.6544 | 0.6509 |
| 1.6444 | 0.0825 | 1300 | 1.6297 | 0.6115 |
| 1.7308 | 0.0888 | 1400 | 1.7002 | 0.6337 |
| 1.8179 | 0.0952 | 1500 | 1.5325 | 0.6179 |
| 1.7174 | 0.1015 | 1600 | 1.5668 | 0.6343 |
| 1.7043 | 0.1079 | 1700 | 1.5762 | 0.6415 |
| 1.5174 | 0.1142 | 1800 | 1.4478 | 0.5991 |
| 1.6108 | 0.1206 | 1900 | 1.4394 | 0.6054 |
| 1.6227 | 0.1269 | 2000 | 1.5185 | 0.5964 |
| 1.773 | 0.1333 | 2100 | 1.6450 | 0.6124 |
| 1.6231 | 0.1396 | 2200 | 1.3599 | 0.6016 |
| 1.6005 | 0.1459 | 2300 | 1.3782 | 0.5943 |
| 1.524 | 0.1523 | 2400 | 1.6099 | 0.5953 |
| 1.5881 | 0.1586 | 2500 | 1.4681 | 0.6160 |
| 1.5577 | 0.1650 | 2600 | 1.3934 | 0.6052 |
| 1.5556 | 0.1713 | 2700 | 1.3960 | 0.6099 |
| 1.609 | 0.1777 | 2800 | 1.4837 | 0.6008 |
| 1.3661 | 0.1840 | 2900 | 1.2992 | 0.5800 |
| 1.4908 | 0.1904 | 3000 | 1.3942 | 0.5844 |
| 1.6022 | 0.1967 | 3100 | 1.4685 | 0.5907 |
| 1.7082 | 0.2031 | 3200 | 1.4685 | 0.6214 |
| 1.5526 | 0.2094 | 3300 | 1.4331 | 0.6038 |
| 1.4424 | 0.2157 | 3400 | 1.3674 | 0.5791 |
| 1.3544 | 0.2221 | 3500 | 1.3960 | 0.5882 |
| 1.4868 | 0.2284 | 3600 | 1.3507 | 0.5882 |
| 1.5313 | 0.2348 | 3700 | 1.5923 | 0.5764 |
| 1.5763 | 0.2411 | 3800 | 1.3493 | 0.5681 |
| 1.3742 | 0.2475 | 3900 | 1.3979 | 0.5827 |
| 1.5003 | 0.2538 | 4000 | 1.4011 | 0.5886 |
| 1.5894 | 0.2602 | 4100 | 1.3652 | 0.5519 |
| 1.5496 | 0.2665 | 4200 | 1.2970 | 0.5860 |
| 1.6228 | 0.2729 | 4300 | 1.4056 | 0.5753 |
| 1.4173 | 0.2792 | 4400 | 1.3485 | 0.5515 |
| 1.5236 | 0.2856 | 4500 | 1.2558 | 0.5473 |
| 1.3976 | 0.2919 | 4600 | 1.3535 | 0.5505 |
| 1.4656 | 0.2982 | 4700 | 1.3634 | 0.5865 |
| 1.4089 | 0.3046 | 4800 | 1.3172 | 0.5458 |
| 1.2844 | 0.3109 | 4900 | 1.3553 | 0.5606 |
| 1.3984 | 0.3173 | 5000 | 1.2458 | 0.5679 |
| 1.4784 | 0.3236 | 5100 | 1.3808 | 0.5484 |
| 1.4339 | 0.3300 | 5200 | 1.3194 | 0.5578 |
| 1.6029 | 0.3363 | 5300 | 1.3393 | 0.5631 |
| 1.4778 | 0.3427 | 5400 | 1.3053 | 0.5639 |
| 1.5093 | 0.3490 | 5500 | 1.2234 | 0.5445 |
| 1.5348 | 0.3554 | 5600 | 1.2514 | 0.5583 |
| 1.5195 | 0.3617 | 5700 | 1.2954 | 0.5561 |
| 1.5035 | 0.3680 | 5800 | 1.1854 | 0.5435 |
| 1.532 | 0.3744 | 5900 | 1.2768 | 0.5486 |
| 1.5833 | 0.3807 | 6000 | 1.3575 | 0.5547 |
| 1.4789 | 0.3871 | 6100 | 1.2984 | 0.5418 |
| 1.3557 | 0.3934 | 6200 | 1.2763 | 0.5326 |
| 1.2738 | 0.3998 | 6300 | 1.3761 | 0.5351 |
| 1.454 | 0.4061 | 6400 | 1.3057 | 0.5325 |
| 1.403 | 0.4125 | 6500 | 1.3707 | 0.5662 |
| 1.5306 | 0.4188 | 6600 | 1.3330 | 0.5583 |
| 1.4576 | 0.4252 | 6700 | 1.2403 | 0.5349 |
| 1.4214 | 0.4315 | 6800 | 1.3175 | 0.5363 |
| 1.5853 | 0.4378 | 6900 | 1.2071 | 0.5281 |
| 1.522 | 0.4442 | 7000 | 1.2593 | 0.5387 |
| 1.2743 | 0.4505 | 7100 | 1.3154 | 0.5549 |
| 1.3778 | 0.4569 | 7200 | 1.2544 | 0.5365 |
| 1.2875 | 0.4632 | 7300 | 1.2312 | 0.5281 |
| 1.3484 | 0.4696 | 7400 | 1.2599 | 0.5340 |
| 1.5618 | 0.4759 | 7500 | 1.3038 | 0.5568 |
| 1.4391 | 0.4823 | 7600 | 1.2523 | 0.5355 |
| 1.4676 | 0.4886 | 7700 | 1.2205 | 0.5311 |
| 1.371 | 0.4950 | 7800 | 1.1996 | 0.5275 |
| 1.361 | 0.5013 | 7900 | 1.1635 | 0.5186 |
| 1.463 | 0.5076 | 8000 | 1.2232 | 0.5344 |
| 1.348 | 0.5140 | 8100 | 1.1407 | 0.5174 |
| 1.4943 | 0.5203 | 8200 | 1.3098 | 0.5361 |
| 1.3703 | 0.5267 | 8300 | 1.1393 | 0.5226 |
| 1.4188 | 0.5330 | 8400 | 1.1914 | 0.4995 |
| 1.3048 | 0.5394 | 8500 | 1.1555 | 0.5170 |
| 1.4468 | 0.5457 | 8600 | 1.1515 | 0.5149 |
| 1.2279 | 0.5521 | 8700 | 1.2047 | 0.5168 |
| 1.2367 | 0.5584 | 8800 | 1.1868 | 0.5208 |
| 1.3536 | 0.5648 | 8900 | 1.1398 | 0.5189 |
| 1.3496 | 0.5711 | 9000 | 1.2705 | 0.5176 |
| 1.4915 | 0.5774 | 9100 | 1.2007 | 0.5079 |
| 1.474 | 0.5838 | 9200 | 1.2192 | 0.5378 |
| 1.3093 | 0.5901 | 9300 | 1.1065 | 0.5125 |
| 1.1994 | 0.5965 | 9400 | 1.2120 | 0.5201 |
| 1.502 | 0.6028 | 9500 | 1.1464 | 0.5290 |
| 1.5327 | 0.6092 | 9600 | 1.2010 | 0.5384 |
| 1.4533 | 0.6155 | 9700 | 1.1614 | 0.5146 |
| 1.4554 | 0.6219 | 9800 | 1.1353 | 0.5155 |
| 1.3605 | 0.6282 | 9900 | 1.1788 | 0.5058 |
| 1.3606 | 0.6346 | 10000 | 1.1877 | 0.5178 |
| 1.3295 | 0.6409 | 10100 | 1.2601 | 0.5452 |
| 1.4172 | 0.6472 | 10200 | 1.1926 | 0.5083 |
| 1.4859 | 0.6536 | 10300 | 1.1426 | 0.5077 |
| 1.5264 | 0.6599 | 10400 | 1.1495 | 0.5121 |
| 1.3455 | 0.6663 | 10500 | 1.1535 | 0.5140 |
| 1.2375 | 0.6726 | 10600 | 1.1857 | 0.5087 |
| 1.2808 | 0.6790 | 10700 | 1.2156 | 0.5005 |
| 1.3358 | 0.6853 | 10800 | 1.1543 | 0.5075 |
| 1.361 | 0.6917 | 10900 | 1.2553 | 0.5290 |
| 1.3859 | 0.6980 | 11000 | 1.1458 | 0.5245 |
| 1.5061 | 0.7044 | 11100 | 1.1547 | 0.5399 |
| 1.2777 | 0.7107 | 11200 | 1.1677 | 0.5180 |
| 1.2645 | 0.7171 | 11300 | 1.2715 | 0.5191 |
| 1.3514 | 0.7234 | 11400 | 1.1811 | 0.5109 |
| 1.2733 | 0.7297 | 11500 | 1.1079 | 0.5134 |
| 1.2961 | 0.7361 | 11600 | 1.3171 | 0.5188 |
| 1.3744 | 0.7424 | 11700 | 1.1271 | 0.5106 |
| 1.2626 | 0.7488 | 11800 | 1.0821 | 0.5001 |
| 1.317 | 0.7551 | 11900 | 1.1156 | 0.5037 |
| 1.1732 | 0.7615 | 12000 | 1.1828 | 0.5287 |
| 1.3693 | 0.7678 | 12100 | 1.1063 | 0.5172 |
| 1.3227 | 0.7742 | 12200 | 1.1450 | 0.4948 |
| 1.41 | 0.7805 | 12300 | 1.0917 | 0.5079 |
| 1.2909 | 0.7869 | 12400 | 1.1321 | 0.5066 |
| 1.1509 | 0.7932 | 12500 | 1.1678 | 0.4921 |
| 1.3629 | 0.7995 | 12600 | 1.1482 | 0.4923 |
| 1.2959 | 0.8059 | 12700 | 1.1488 | 0.5064 |
| 1.3537 | 0.8122 | 12800 | 1.1356 | 0.5014 |
| 1.1468 | 0.8186 | 12900 | 1.0872 | 0.4970 |
| 1.3582 | 0.8249 | 13000 | 1.1221 | 0.4913 |
| 1.3872 | 0.8313 | 13100 | 1.2243 | 0.4925 |
| 1.4433 | 0.8376 | 13200 | 1.1536 | 0.5205 |
| 1.4121 | 0.8440 | 13300 | 1.1322 | 0.5050 |
| 1.0979 | 0.8503 | 13400 | 1.1252 | 0.4927 |
| 1.3989 | 0.8567 | 13500 | 1.0632 | 0.5035 |
| 1.1527 | 0.8630 | 13600 | 1.1041 | 0.4942 |
| 1.167 | 0.8693 | 13700 | 1.0895 | 0.4913 |
| 1.3945 | 0.8757 | 13800 | 1.1256 | 0.5050 |
| 1.2615 | 0.8820 | 13900 | 1.1768 | 0.5226 |
| 1.3221 | 0.8884 | 14000 | 1.2068 | 0.5050 |
| 1.4254 | 0.8947 | 14100 | 1.1757 | 0.5028 |
| 1.1403 | 0.9011 | 14200 | 1.1589 | 0.5161 |
| 1.304 | 0.9074 | 14300 | 1.1399 | 0.5140 |
| 1.3728 | 0.9138 | 14400 | 1.0997 | 0.5132 |
| 1.4233 | 0.9201 | 14500 | 1.0702 | 0.4948 |
| 1.1177 | 0.9265 | 14600 | 1.0930 | 0.4936 |
| 1.186 | 0.9328 | 14700 | 1.1956 | 0.4919 |
| 1.3834 | 0.9391 | 14800 | 1.1244 | 0.5007 |
| 1.3236 | 0.9455 | 14900 | 1.1373 | 0.5222 |
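
WER here is the standard word error rate, (S + D + I) / N: substitutions, deletions, and insertions over the number of reference words. A sketch of computing it with the evaluate library (whether the training script used this exact implementation is an assumption; the transcripts are hypothetical):

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")

# Hypothetical transcripts, for illustration only.
predictions = ["the quick brown fox", "hello word"]
references = ["the quick brown fox jumps", "hello world"]

# 1 deletion + 1 substitution over 7 reference words ≈ 0.2857.
print(wer_metric.compute(predictions=predictions, references=references))
```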

Framework versions

  • Transformers 4.43.4
  • Pytorch 2.4.1
  • Datasets 3.0.0
  • Tokenizers 0.19.1