
dit-base_tobacco-small_tobacco3482_kd

This model is a fine-tuned version of WinKawaks/vit-small-patch16-224; the fine-tuning dataset is not named in this auto-generated card. It achieves the following results on the evaluation set (a sketch of how the calibration metrics are typically computed follows the list):

  • Loss: 0.5105
  • Accuracy: 0.815
  • Brier Loss: 0.2790
  • NLL: 1.4944
  • F1 Micro: 0.815
  • F1 Macro: 0.7942
  • ECE: 0.1287
  • AURC: 0.0524
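
Brier loss, NLL, ECE, and AURC measure calibration and risk rather than plain accuracy. The snippet below is a minimal sketch of how the Brier score and expected calibration error (ECE) are commonly computed from predicted class probabilities; the exact binning and averaging conventions used to produce the numbers above are not documented in this card, so treat it as illustrative only.

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared distance between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted average gap between confidence and accuracy over confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return float(ece)

# Toy example: 4 samples, 3 classes (values are illustrative, not from this model).
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.6, 0.3, 0.1]])
labels = np.array([0, 1, 2, 1])
print(brier_score(probs, labels), expected_calibration_error(probs, labels))
```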

Model description

More information needed

Intended uses & limitations

More information needed
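
In the absence of documented usage, the following is a hypothetical sketch of how the checkpoint could be loaded for image classification with the Transformers Auto classes. The image path "document.png" is a placeholder, and the label names returned depend on how the classes were mapped during fine-tuning.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/dit-base_tobacco-small_tobacco3482_kd"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# "document.png" is a placeholder path to an input image.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Label names come from the fine-tuned config and may not match the source
# dataset's canonical class names exactly.
print(model.config.id2label[logits.argmax(-1).item()])
```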

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch of these settings follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
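
As a rough guide, these settings map onto Hugging Face `TrainingArguments` as sketched below. The `output_dir` value is a placeholder, and any distillation-specific objective implied by the "_kd" suffix is not documented in this card and is therefore omitted.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir is a placeholder and the
# knowledge-distillation loss (if any) would be handled outside these arguments.
training_args = TrainingArguments(
    output_dir="./dit-base_tobacco-small_tobacco3482_kd",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```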

Training results

The Training Loss column shows "No log" for epochs evaluated before the first logged training-loss value.

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.2378 | 0.17 | 0.8975 | 4.4036 | 0.17 | 0.1418 | 0.2519 | 0.8078 |
| No log | 2.0 | 14 | 1.7484 | 0.38 | 0.7667 | 4.1809 | 0.38 | 0.2513 | 0.3132 | 0.4423 |
| No log | 3.0 | 21 | 1.1417 | 0.55 | 0.5683 | 1.8669 | 0.55 | 0.4592 | 0.2551 | 0.2287 |
| No log | 4.0 | 28 | 0.8020 | 0.685 | 0.4327 | 1.7476 | 0.685 | 0.6393 | 0.2274 | 0.1292 |
| No log | 5.0 | 35 | 0.8347 | 0.645 | 0.4502 | 1.6809 | 0.645 | 0.6306 | 0.1939 | 0.1346 |
| No log | 6.0 | 42 | 0.6546 | 0.735 | 0.3657 | 1.5210 | 0.735 | 0.7191 | 0.1995 | 0.0901 |
| No log | 7.0 | 49 | 0.6447 | 0.76 | 0.3375 | 1.5117 | 0.76 | 0.7450 | 0.1781 | 0.0875 |
| No log | 8.0 | 56 | 0.7089 | 0.775 | 0.3650 | 1.4823 | 0.775 | 0.7554 | 0.2026 | 0.0971 |
| No log | 9.0 | 63 | 0.5721 | 0.785 | 0.3083 | 1.4053 | 0.785 | 0.7633 | 0.1647 | 0.0651 |
| No log | 10.0 | 70 | 0.5953 | 0.795 | 0.3130 | 1.4301 | 0.795 | 0.7971 | 0.1661 | 0.0701 |
| No log | 11.0 | 77 | 0.6352 | 0.79 | 0.3131 | 1.5018 | 0.79 | 0.7607 | 0.1503 | 0.0789 |
| No log | 12.0 | 84 | 0.7999 | 0.735 | 0.3916 | 1.7141 | 0.735 | 0.7065 | 0.2143 | 0.1178 |
| No log | 13.0 | 91 | 0.6602 | 0.8 | 0.3099 | 1.8022 | 0.8000 | 0.7746 | 0.1709 | 0.0805 |
| No log | 14.0 | 98 | 0.6529 | 0.785 | 0.3298 | 1.3607 | 0.785 | 0.7658 | 0.1771 | 0.0858 |
| No log | 15.0 | 105 | 0.6170 | 0.8 | 0.3098 | 1.3676 | 0.8000 | 0.7838 | 0.1630 | 0.0723 |
| No log | 16.0 | 112 | 0.6484 | 0.775 | 0.3342 | 1.2826 | 0.775 | 0.7752 | 0.1837 | 0.0827 |
| No log | 17.0 | 119 | 0.5817 | 0.78 | 0.3019 | 1.6577 | 0.78 | 0.7730 | 0.1566 | 0.0582 |
| No log | 18.0 | 126 | 0.6528 | 0.78 | 0.3376 | 1.5044 | 0.78 | 0.7788 | 0.1687 | 0.0768 |
| No log | 19.0 | 133 | 0.6241 | 0.805 | 0.3038 | 1.3465 | 0.805 | 0.7796 | 0.1498 | 0.0759 |
| No log | 20.0 | 140 | 0.5610 | 0.79 | 0.2948 | 1.4395 | 0.79 | 0.7716 | 0.1515 | 0.0708 |
| No log | 21.0 | 147 | 0.6829 | 0.78 | 0.3241 | 1.3252 | 0.78 | 0.7687 | 0.1782 | 0.0852 |
| No log | 22.0 | 154 | 0.5443 | 0.795 | 0.3117 | 1.4374 | 0.795 | 0.7822 | 0.1730 | 0.0679 |
| No log | 23.0 | 161 | 0.6968 | 0.78 | 0.3474 | 1.7830 | 0.78 | 0.7880 | 0.1745 | 0.0813 |
| No log | 24.0 | 168 | 0.7422 | 0.75 | 0.3639 | 1.5379 | 0.75 | 0.7238 | 0.1982 | 0.0940 |
| No log | 25.0 | 175 | 0.5756 | 0.785 | 0.3150 | 1.4739 | 0.785 | 0.7723 | 0.1615 | 0.0675 |
| No log | 26.0 | 182 | 0.6127 | 0.805 | 0.3036 | 1.5553 | 0.805 | 0.7990 | 0.1416 | 0.0659 |
| No log | 27.0 | 189 | 0.5852 | 0.795 | 0.3104 | 1.5149 | 0.795 | 0.7808 | 0.1583 | 0.0625 |
| No log | 28.0 | 196 | 0.5421 | 0.83 | 0.2808 | 1.4320 | 0.83 | 0.8147 | 0.1475 | 0.0558 |
| No log | 29.0 | 203 | 0.5588 | 0.79 | 0.2888 | 1.5801 | 0.79 | 0.7723 | 0.1465 | 0.0580 |
| No log | 30.0 | 210 | 0.5532 | 0.795 | 0.2892 | 1.5724 | 0.795 | 0.7790 | 0.1453 | 0.0576 |
| No log | 31.0 | 217 | 0.5050 | 0.835 | 0.2685 | 1.4206 | 0.835 | 0.8221 | 0.1459 | 0.0549 |
| No log | 32.0 | 224 | 0.5067 | 0.82 | 0.2762 | 1.4460 | 0.82 | 0.8017 | 0.1494 | 0.0538 |
| No log | 33.0 | 231 | 0.5200 | 0.815 | 0.2798 | 1.5300 | 0.815 | 0.7973 | 0.1442 | 0.0541 |
| No log | 34.0 | 238 | 0.5110 | 0.825 | 0.2802 | 1.6009 | 0.825 | 0.8095 | 0.1462 | 0.0537 |
| No log | 35.0 | 245 | 0.5125 | 0.815 | 0.2804 | 1.5209 | 0.815 | 0.8013 | 0.1555 | 0.0540 |
| No log | 36.0 | 252 | 0.4981 | 0.82 | 0.2728 | 1.4498 | 0.82 | 0.8032 | 0.1557 | 0.0522 |
| No log | 37.0 | 259 | 0.5196 | 0.82 | 0.2796 | 1.5297 | 0.82 | 0.8057 | 0.1396 | 0.0523 |
| No log | 38.0 | 266 | 0.5034 | 0.82 | 0.2755 | 1.4577 | 0.82 | 0.8000 | 0.1449 | 0.0524 |
| No log | 39.0 | 273 | 0.5190 | 0.815 | 0.2810 | 1.5240 | 0.815 | 0.8003 | 0.1516 | 0.0533 |
| No log | 40.0 | 280 | 0.4926 | 0.83 | 0.2697 | 1.4598 | 0.83 | 0.8161 | 0.1248 | 0.0514 |
| No log | 41.0 | 287 | 0.5117 | 0.815 | 0.2808 | 1.5168 | 0.815 | 0.7965 | 0.1306 | 0.0525 |
| No log | 42.0 | 294 | 0.5034 | 0.825 | 0.2721 | 1.5263 | 0.825 | 0.8143 | 0.1389 | 0.0533 |
| No log | 43.0 | 301 | 0.5073 | 0.815 | 0.2762 | 1.5308 | 0.815 | 0.7916 | 0.1452 | 0.0511 |
| No log | 44.0 | 308 | 0.5017 | 0.825 | 0.2751 | 1.5202 | 0.825 | 0.8095 | 0.1473 | 0.0525 |
| No log | 45.0 | 315 | 0.5052 | 0.815 | 0.2783 | 1.5143 | 0.815 | 0.7965 | 0.1451 | 0.0525 |
| No log | 46.0 | 322 | 0.5043 | 0.83 | 0.2743 | 1.5172 | 0.83 | 0.8172 | 0.1481 | 0.0517 |
| No log | 47.0 | 329 | 0.5057 | 0.825 | 0.2767 | 1.5164 | 0.825 | 0.8089 | 0.1325 | 0.0520 |
| No log | 48.0 | 336 | 0.5033 | 0.82 | 0.2752 | 1.5168 | 0.82 | 0.8061 | 0.1430 | 0.0523 |
| No log | 49.0 | 343 | 0.5042 | 0.82 | 0.2755 | 1.5163 | 0.82 | 0.8061 | 0.1394 | 0.0517 |
| No log | 50.0 | 350 | 0.5068 | 0.82 | 0.2767 | 1.5153 | 0.82 | 0.8061 | 0.1471 | 0.0517 |
| No log | 51.0 | 357 | 0.5048 | 0.82 | 0.2759 | 1.5137 | 0.82 | 0.8061 | 0.1419 | 0.0519 |
| No log | 52.0 | 364 | 0.5044 | 0.825 | 0.2759 | 1.5112 | 0.825 | 0.8064 | 0.1342 | 0.0518 |
| No log | 53.0 | 371 | 0.5046 | 0.825 | 0.2756 | 1.5122 | 0.825 | 0.8116 | 0.1388 | 0.0514 |
| No log | 54.0 | 378 | 0.5078 | 0.815 | 0.2777 | 1.5111 | 0.815 | 0.7984 | 0.1442 | 0.0519 |
| No log | 55.0 | 385 | 0.5059 | 0.815 | 0.2767 | 1.5109 | 0.815 | 0.7984 | 0.1351 | 0.0518 |
| No log | 56.0 | 392 | 0.5087 | 0.82 | 0.2779 | 1.5089 | 0.82 | 0.8061 | 0.1391 | 0.0518 |
| No log | 57.0 | 399 | 0.5072 | 0.82 | 0.2771 | 1.5094 | 0.82 | 0.8061 | 0.1339 | 0.0517 |
| No log | 58.0 | 406 | 0.5079 | 0.82 | 0.2776 | 1.5074 | 0.82 | 0.8061 | 0.1366 | 0.0518 |
| No log | 59.0 | 413 | 0.5072 | 0.82 | 0.2771 | 1.5072 | 0.82 | 0.8061 | 0.1308 | 0.0518 |
| No log | 60.0 | 420 | 0.5084 | 0.825 | 0.2776 | 1.5059 | 0.825 | 0.8116 | 0.1303 | 0.0520 |
| No log | 61.0 | 427 | 0.5074 | 0.82 | 0.2772 | 1.5066 | 0.82 | 0.8038 | 0.1244 | 0.0520 |
| No log | 62.0 | 434 | 0.5090 | 0.82 | 0.2781 | 1.5053 | 0.82 | 0.8061 | 0.1367 | 0.0519 |
| No log | 63.0 | 441 | 0.5094 | 0.825 | 0.2779 | 1.5050 | 0.825 | 0.8116 | 0.1305 | 0.0520 |
| No log | 64.0 | 448 | 0.5098 | 0.82 | 0.2782 | 1.5049 | 0.82 | 0.8038 | 0.1314 | 0.0520 |
| No log | 65.0 | 455 | 0.5086 | 0.82 | 0.2780 | 1.5038 | 0.82 | 0.8038 | 0.1249 | 0.0520 |
| No log | 66.0 | 462 | 0.5103 | 0.82 | 0.2787 | 1.5023 | 0.82 | 0.8038 | 0.1222 | 0.0522 |
| No log | 67.0 | 469 | 0.5095 | 0.82 | 0.2782 | 1.5025 | 0.82 | 0.8038 | 0.1228 | 0.0521 |
| No log | 68.0 | 476 | 0.5095 | 0.82 | 0.2783 | 1.5027 | 0.82 | 0.8038 | 0.1330 | 0.0522 |
| No log | 69.0 | 483 | 0.5097 | 0.82 | 0.2785 | 1.5015 | 0.82 | 0.8038 | 0.1228 | 0.0521 |
| No log | 70.0 | 490 | 0.5109 | 0.82 | 0.2788 | 1.5005 | 0.82 | 0.8038 | 0.1322 | 0.0520 |
| No log | 71.0 | 497 | 0.5096 | 0.82 | 0.2784 | 1.5012 | 0.82 | 0.8038 | 0.1320 | 0.0522 |
| 0.1366 | 72.0 | 504 | 0.5095 | 0.82 | 0.2784 | 1.5011 | 0.82 | 0.8038 | 0.1219 | 0.0522 |
| 0.1366 | 73.0 | 511 | 0.5109 | 0.82 | 0.2791 | 1.4998 | 0.82 | 0.8038 | 0.1249 | 0.0523 |
| 0.1366 | 74.0 | 518 | 0.5100 | 0.82 | 0.2786 | 1.5000 | 0.82 | 0.8038 | 0.1219 | 0.0521 |
| 0.1366 | 75.0 | 525 | 0.5096 | 0.82 | 0.2784 | 1.5000 | 0.82 | 0.8038 | 0.1238 | 0.0521 |
| 0.1366 | 76.0 | 532 | 0.5104 | 0.82 | 0.2787 | 1.4988 | 0.82 | 0.8038 | 0.1341 | 0.0523 |
| 0.1366 | 77.0 | 539 | 0.5105 | 0.82 | 0.2788 | 1.4985 | 0.82 | 0.8038 | 0.1340 | 0.0521 |
| 0.1366 | 78.0 | 546 | 0.5103 | 0.82 | 0.2788 | 1.4985 | 0.82 | 0.8038 | 0.1338 | 0.0520 |
| 0.1366 | 79.0 | 553 | 0.5105 | 0.82 | 0.2788 | 1.4983 | 0.82 | 0.8038 | 0.1317 | 0.0522 |
| 0.1366 | 80.0 | 560 | 0.5106 | 0.82 | 0.2789 | 1.4977 | 0.82 | 0.8038 | 0.1337 | 0.0523 |
| 0.1366 | 81.0 | 567 | 0.5108 | 0.82 | 0.2790 | 1.4971 | 0.82 | 0.8038 | 0.1339 | 0.0523 |
| 0.1366 | 82.0 | 574 | 0.5107 | 0.82 | 0.2790 | 1.4970 | 0.82 | 0.8038 | 0.1317 | 0.0521 |
| 0.1366 | 83.0 | 581 | 0.5108 | 0.82 | 0.2790 | 1.4968 | 0.82 | 0.8038 | 0.1339 | 0.0522 |
| 0.1366 | 84.0 | 588 | 0.5105 | 0.82 | 0.2789 | 1.4966 | 0.82 | 0.8038 | 0.1340 | 0.0522 |
| 0.1366 | 85.0 | 595 | 0.5106 | 0.82 | 0.2789 | 1.4961 | 0.82 | 0.8038 | 0.1338 | 0.0523 |
| 0.1366 | 86.0 | 602 | 0.5109 | 0.82 | 0.2790 | 1.4958 | 0.82 | 0.8038 | 0.1336 | 0.0524 |
| 0.1366 | 87.0 | 609 | 0.5105 | 0.815 | 0.2789 | 1.4956 | 0.815 | 0.7942 | 0.1290 | 0.0525 |
| 0.1366 | 88.0 | 616 | 0.5105 | 0.815 | 0.2790 | 1.4954 | 0.815 | 0.7942 | 0.1290 | 0.0525 |
| 0.1366 | 89.0 | 623 | 0.5106 | 0.815 | 0.2790 | 1.4952 | 0.815 | 0.7942 | 0.1290 | 0.0526 |
| 0.1366 | 90.0 | 630 | 0.5106 | 0.82 | 0.2790 | 1.4951 | 0.82 | 0.8038 | 0.1338 | 0.0523 |
| 0.1366 | 91.0 | 637 | 0.5107 | 0.815 | 0.2790 | 1.4949 | 0.815 | 0.7942 | 0.1289 | 0.0526 |
| 0.1366 | 92.0 | 644 | 0.5107 | 0.815 | 0.2790 | 1.4947 | 0.815 | 0.7942 | 0.1289 | 0.0526 |
| 0.1366 | 93.0 | 651 | 0.5107 | 0.815 | 0.2790 | 1.4947 | 0.815 | 0.7942 | 0.1289 | 0.0525 |
| 0.1366 | 94.0 | 658 | 0.5107 | 0.82 | 0.2790 | 1.4946 | 0.82 | 0.8038 | 0.1335 | 0.0523 |
| 0.1366 | 95.0 | 665 | 0.5106 | 0.82 | 0.2790 | 1.4946 | 0.82 | 0.8038 | 0.1335 | 0.0523 |
| 0.1366 | 96.0 | 672 | 0.5105 | 0.815 | 0.2790 | 1.4945 | 0.815 | 0.7942 | 0.1289 | 0.0524 |
| 0.1366 | 97.0 | 679 | 0.5105 | 0.815 | 0.2790 | 1.4945 | 0.815 | 0.7942 | 0.1289 | 0.0524 |
| 0.1366 | 98.0 | 686 | 0.5105 | 0.815 | 0.2790 | 1.4944 | 0.815 | 0.7942 | 0.1289 | 0.0524 |
| 0.1366 | 99.0 | 693 | 0.5105 | 0.815 | 0.2790 | 1.4944 | 0.815 | 0.7942 | 0.1287 | 0.0524 |
| 0.1366 | 100.0 | 700 | 0.5105 | 0.815 | 0.2790 | 1.4944 | 0.815 | 0.7942 | 0.1287 | 0.0524 |

Framework versions

  • Transformers 4.36.0.dev0
  • PyTorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1