
tiny_bert_28_hr_intents

This model is a fine-tuned version of prajjwal1/bert-tiny on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4764
  • Accuracy: 0.9314

Model description

More information needed

Intended uses & limitations

More information needed
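A minimal usage sketch, assuming the checkpoint is used as a standard `transformers` text-classification model (the 28 HR intent label names are not documented on this card, so the predicted labels depend on the fine-tuning config):

```python
def load_intent_classifier(model_id: str = "m-aliabbas1/tiny_bert_28_hr_intents"):
    """Build a text-classification pipeline for this checkpoint.

    Requires `pip install transformers torch`; weights are downloaded
    on first use. The import is lazy so this sketch stays importable
    without the library installed.
    """
    from transformers import pipeline
    return pipeline("text-classification", model=model_id)

# Hypothetical call (input text is an assumption, not from the card):
#   clf = load_intent_classifier()
#   clf("I want to apply for annual leave.")
```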

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
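The hyperparameters above can be expressed as a `transformers` `TrainingArguments` config. This is a sketch, not the author's actual training script; `output_dir` is a placeholder and single-device training is assumed:

```python
from transformers import TrainingArguments

# Config sketch mirroring the reported hyperparameters.
args = TrainingArguments(
    output_dir="tiny_bert_28_hr_intents",  # placeholder path
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # the table below logs one eval per epoch
)
```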

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 238   | 3.1723          | 0.1277   |
| No log        | 2.0   | 476   | 2.9582          | 0.3735   |
| 3.1618        | 3.0   | 714   | 2.7791          | 0.4350   |
| 3.1618        | 4.0   | 952   | 2.5973          | 0.4823   |
| 2.7996        | 5.0   | 1190  | 2.4334          | 0.5012   |
| 2.7996        | 6.0   | 1428  | 2.2703          | 0.5201   |
| 2.4391        | 7.0   | 1666  | 2.1174          | 0.5721   |
| 2.4391        | 8.0   | 1904  | 1.9785          | 0.6312   |
| 2.116         | 9.0   | 2142  | 1.8518          | 0.6501   |
| 2.116         | 10.0  | 2380  | 1.7367          | 0.6832   |
| 1.8448        | 11.0  | 2618  | 1.6282          | 0.7139   |
| 1.8448        | 12.0  | 2856  | 1.5288          | 0.7352   |
| 1.6           | 13.0  | 3094  | 1.4400          | 0.7541   |
| 1.6           | 14.0  | 3332  | 1.3565          | 0.7778   |
| 1.411         | 15.0  | 3570  | 1.2838          | 0.7896   |
| 1.411         | 16.0  | 3808  | 1.2116          | 0.8227   |
| 1.2397        | 17.0  | 4046  | 1.1513          | 0.8227   |
| 1.2397        | 18.0  | 4284  | 1.0895          | 0.8345   |
| 1.1086        | 19.0  | 4522  | 1.0400          | 0.8416   |
| 1.1086        | 20.0  | 4760  | 0.9865          | 0.8416   |
| 1.1086        | 21.0  | 4998  | 0.9386          | 0.8487   |
| 0.9884        | 22.0  | 5236  | 0.8953          | 0.8629   |
| 0.9884        | 23.0  | 5474  | 0.8556          | 0.8652   |
| 0.8911        | 24.0  | 5712  | 0.8223          | 0.8700   |
| 0.8911        | 25.0  | 5950  | 0.7884          | 0.8771   |
| 0.8159        | 26.0  | 6188  | 0.7545          | 0.8771   |
| 0.8159        | 27.0  | 6426  | 0.7267          | 0.8842   |
| 0.7358        | 28.0  | 6664  | 0.7005          | 0.8913   |
| 0.7358        | 29.0  | 6902  | 0.6762          | 0.8983   |
| 0.682         | 30.0  | 7140  | 0.6571          | 0.8960   |
| 0.682         | 31.0  | 7378  | 0.6334          | 0.9007   |
| 0.6252        | 32.0  | 7616  | 0.6134          | 0.9078   |
| 0.6252        | 33.0  | 7854  | 0.5990          | 0.9054   |
| 0.5864        | 34.0  | 8092  | 0.5827          | 0.9078   |
| 0.5864        | 35.0  | 8330  | 0.5656          | 0.9149   |
| 0.5579        | 36.0  | 8568  | 0.5542          | 0.9125   |
| 0.5579        | 37.0  | 8806  | 0.5436          | 0.9125   |
| 0.525         | 38.0  | 9044  | 0.5319          | 0.9173   |
| 0.525         | 39.0  | 9282  | 0.5221          | 0.9220   |
| 0.5001        | 40.0  | 9520  | 0.5143          | 0.9243   |
| 0.5001        | 41.0  | 9758  | 0.5067          | 0.9220   |
| 0.5001        | 42.0  | 9996  | 0.5007          | 0.9267   |
| 0.4829        | 43.0  | 10234 | 0.4953          | 0.9267   |
| 0.4829        | 44.0  | 10472 | 0.4897          | 0.9314   |
| 0.4728        | 45.0  | 10710 | 0.4853          | 0.9314   |
| 0.4728        | 46.0  | 10948 | 0.4832          | 0.9267   |
| 0.4536        | 47.0  | 11186 | 0.4801          | 0.9291   |
| 0.4536        | 48.0  | 11424 | 0.4777          | 0.9291   |
| 0.4576        | 49.0  | 11662 | 0.4769          | 0.9291   |
| 0.4576        | 50.0  | 11900 | 0.4764          | 0.9314   |
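Although the training data is undocumented, the logged step counts bound its size: 238 optimizer steps per epoch at train_batch_size 16 implies roughly 3.8k training examples. A quick check:

```python
# The table logs 238 optimizer steps per epoch; with train_batch_size 16
# this bounds the number of training examples (the last batch may be partial).
steps_per_epoch = 238
batch_size = 16

max_examples = steps_per_epoch * batch_size            # full final batch
min_examples = (steps_per_epoch - 1) * batch_size + 1  # partial final batch

print(min_examples, max_examples)  # → 3793 3808
```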

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3

Model tree for m-aliabbas1/tiny_bert_28_hr_intents

Fine-tuned from prajjwal1/bert-tiny