
tiny_bert_29_mva_intents

This model is a fine-tuned version of prajjwal1/bert-tiny on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3517
  • Accuracy: 0.9211

Model description

More information needed

Intended uses & limitations

More information needed
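
While the card leaves the intended uses unspecified, the checkpoint appears to be an intent classifier (a sequence-classification head on bert-tiny evaluated with accuracy), so it can be loaded through the standard Transformers text-classification pipeline. The snippet below is a minimal sketch; the example utterance is made up, and the returned label names depend on the (undocumented) training data.

```python
from transformers import pipeline

# Minimal inference sketch (assumption: this checkpoint carries a sequence-classification head).
classifier = pipeline(
    "text-classification",
    model="m-aliabbas1/tiny_bert_29_mva_intents",
)

# Hypothetical utterance -- the actual intent labels depend on the dataset the model was trained on.
print(classifier("I need to renew my vehicle registration."))
# -> [{'label': '<intent name>', 'score': 0.97}]  (illustrative output only)
```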

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
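
The training script is not included in this card; the sketch below shows one way the hyperparameters above would map onto the Hugging Face Trainer API. The dataset objects and num_labels=29 (inferred from the model name) are assumptions, not facts stated in the card.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-tiny")
model = AutoModelForSequenceClassification.from_pretrained(
    "prajjwal1/bert-tiny",
    num_labels=29,  # assumption: label count inferred from the model name
)

args = TrainingArguments(
    output_dir="tiny_bert_29_mva_intents",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
    # The default AdamW betas (0.9, 0.999) and epsilon (1e-08) match the optimizer settings listed above.
)

train_ds = None  # placeholder: replace with a tokenized training dataset
eval_ds = None   # placeholder: replace with a tokenized evaluation dataset

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,
)
# trainer.train()
```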

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 406   | 2.9779          | 0.3421   |
| 3.1383        | 2.0   | 812   | 2.6427          | 0.4474   |
| 2.743         | 3.0   | 1218  | 2.3476          | 0.5439   |
| 2.3913        | 4.0   | 1624  | 2.0961          | 0.6082   |
| 2.0764        | 5.0   | 2030  | 1.8721          | 0.6550   |
| 2.0764        | 6.0   | 2436  | 1.6858          | 0.6696   |
| 1.8321        | 7.0   | 2842  | 1.5348          | 0.6959   |
| 1.6047        | 8.0   | 3248  | 1.3980          | 0.7281   |
| 1.439         | 9.0   | 3654  | 1.2846          | 0.7544   |
| 1.2773        | 10.0  | 4060  | 1.1894          | 0.7632   |
| 1.2773        | 11.0  | 4466  | 1.0994          | 0.7661   |
| 1.1596        | 12.0  | 4872  | 1.0221          | 0.7982   |
| 1.0563        | 13.0  | 5278  | 0.9486          | 0.8158   |
| 0.962         | 14.0  | 5684  | 0.8931          | 0.8275   |
| 0.8814        | 15.0  | 6090  | 0.8328          | 0.8363   |
| 0.8814        | 16.0  | 6496  | 0.7830          | 0.8392   |
| 0.8114        | 17.0  | 6902  | 0.7394          | 0.8596   |
| 0.7515        | 18.0  | 7308  | 0.6997          | 0.8684   |
| 0.6936        | 19.0  | 7714  | 0.6659          | 0.8860   |
| 0.6491        | 20.0  | 8120  | 0.6316          | 0.8801   |
| 0.614         | 21.0  | 8526  | 0.6014          | 0.8860   |
| 0.614         | 22.0  | 8932  | 0.5799          | 0.8947   |
| 0.5693        | 23.0  | 9338  | 0.5529          | 0.8918   |
| 0.5362        | 24.0  | 9744  | 0.5368          | 0.9035   |
| 0.5026        | 25.0  | 10150 | 0.5137          | 0.9006   |
| 0.4733        | 26.0  | 10556 | 0.4944          | 0.9035   |
| 0.4733        | 27.0  | 10962 | 0.4807          | 0.9006   |
| 0.4582        | 28.0  | 11368 | 0.4646          | 0.9152   |
| 0.4276        | 29.0  | 11774 | 0.4536          | 0.9064   |
| 0.4047        | 30.0  | 12180 | 0.4377          | 0.9123   |
| 0.3941        | 31.0  | 12586 | 0.4286          | 0.9094   |
| 0.3941        | 32.0  | 12992 | 0.4188          | 0.9123   |
| 0.3741        | 33.0  | 13398 | 0.4076          | 0.9123   |
| 0.3613        | 34.0  | 13804 | 0.4024          | 0.9094   |
| 0.3483        | 35.0  | 14210 | 0.3961          | 0.9094   |
| 0.3437        | 36.0  | 14616 | 0.3918          | 0.9094   |
| 0.3266        | 37.0  | 15022 | 0.3830          | 0.9094   |
| 0.3266        | 38.0  | 15428 | 0.3770          | 0.9152   |
| 0.3216        | 39.0  | 15834 | 0.3746          | 0.9123   |
| 0.3042        | 40.0  | 16240 | 0.3704          | 0.9211   |
| 0.3052        | 41.0  | 16646 | 0.3660          | 0.9211   |
| 0.2962        | 42.0  | 17052 | 0.3655          | 0.9181   |
| 0.2962        | 43.0  | 17458 | 0.3604          | 0.9123   |
| 0.2912        | 44.0  | 17864 | 0.3584          | 0.9152   |
| 0.2909        | 45.0  | 18270 | 0.3557          | 0.9152   |
| 0.2854        | 46.0  | 18676 | 0.3546          | 0.9181   |
| 0.2829        | 47.0  | 19082 | 0.3548          | 0.9152   |
| 0.2829        | 48.0  | 19488 | 0.3530          | 0.9181   |
| 0.278         | 49.0  | 19894 | 0.3519          | 0.9211   |
| 0.2827        | 50.0  | 20300 | 0.3517          | 0.9211   |
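
The training data are not described, but the step counts in the table imply an approximate size for the training split (assuming no gradient accumulation, which the hyperparameter list does not mention):

$$
N_{\text{train}} \approx 406 \ \text{steps/epoch} \times 16 \ \text{examples/step} \approx 6{,}500 \ \text{examples}
$$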

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
