
dit-base_tobacco-small_tobacco3482_kd_NKD_t1.0_g1.5

This model is a fine-tuned version of WinKawaks/vit-small-patch16-224 on the tobacco3482 dataset. It achieves the following results on the evaluation set:

  • Loss: 3.1084
  • Accuracy: 0.825
  • Brier Loss: 0.2907
  • NLL: 1.2013
  • F1 Micro: 0.825
  • F1 Macro: 0.8171
  • ECE: 0.1500
  • AURC: 0.0459
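
Besides accuracy and F1, the evaluation reports calibration and selective-prediction metrics (Brier loss, NLL, ECE, AURC). For reference, here is a minimal sketch of how the first three can be computed from logits and labels in plain PyTorch; these are the generic textbook definitions, not necessarily the exact implementations that produced the numbers above.

```python
import torch
import torch.nn.functional as F

def brier_loss(logits: torch.Tensor, labels: torch.Tensor) -> float:
    """Multiclass Brier score: squared error between probabilities and one-hot labels."""
    probs = F.softmax(logits, dim=-1)
    onehot = F.one_hot(labels, num_classes=logits.size(-1)).float()
    return ((probs - onehot) ** 2).sum(dim=-1).mean().item()

def nll(logits: torch.Tensor, labels: torch.Tensor) -> float:
    """Negative log-likelihood of the true class."""
    return F.cross_entropy(logits, labels).item()

def ece(logits: torch.Tensor, labels: torch.Tensor, n_bins: int = 10) -> float:
    """Expected calibration error over equal-width confidence bins."""
    probs = F.softmax(logits, dim=-1)
    conf, pred = probs.max(dim=-1)
    correct = pred.eq(labels).float()
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            # |accuracy - confidence| within the bin, weighted by bin mass
            gap = (correct[mask].mean() - conf[mask].mean()).abs().item()
            total += gap * mask.float().mean().item()
    return total
```

AURC (area under the risk-coverage curve) is computed analogously by sorting predictions by confidence and integrating the error rate as coverage grows.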

Model description

More information needed
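
The name dit-base_tobacco-small_tobacco3482_kd_NKD_t1.0_g1.5 suggests a ViT-small student distilled from a DiT-base teacher fine-tuned on Tobacco3482, using NKD with temperature t=1.0 and weight g=1.5, but the card itself provides no detail. Purely as an illustration, the sketch below shows the plain temperature-scaled distillation loss such setups build on; this is vanilla Hinton-style KD, not the exact NKD objective, and the roles of t and gamma are assumed from the file name.

```python
# Hedged sketch of a generic soft-target distillation loss. NOT the exact
# NKD objective; t and gamma are assumed to play the roles the name implies.
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, t: float = 1.0, gamma: float = 1.5):
    # Hard-label cross-entropy on the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-target KL between temperature-scaled teacher and student distributions,
    # rescaled by t**2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)
    return ce + gamma * soft
```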

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
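
These settings map directly onto transformers TrainingArguments; a minimal sketch, with the output directory and any option not listed above as placeholders or defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                    # placeholder, not stated on the card
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
# The Adam settings above (betas=(0.9, 0.999), epsilon=1e-08) match the
# Trainer's default optimizer, so no explicit optimizer argument is needed.
```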

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 7 | 5.5631 | 0.135 | 0.9164 | 5.3726 | 0.135 | 0.1126 | 0.2543 | 0.8397 |
| No log | 2.0 | 14 | 4.9048 | 0.35 | 0.8238 | 3.0911 | 0.35 | 0.2637 | 0.3348 | 0.6709 |
| No log | 3.0 | 21 | 4.1439 | 0.49 | 0.6650 | 1.8580 | 0.49 | 0.4532 | 0.2990 | 0.2902 |
| No log | 4.0 | 28 | 3.5518 | 0.66 | 0.4867 | 1.6397 | 0.66 | 0.6303 | 0.2902 | 0.1489 |
| No log | 5.0 | 35 | 3.3371 | 0.755 | 0.3981 | 1.6213 | 0.755 | 0.7261 | 0.2670 | 0.0984 |
| No log | 6.0 | 42 | 3.4978 | 0.69 | 0.4211 | 1.5668 | 0.69 | 0.6792 | 0.2240 | 0.1170 |
| No log | 7.0 | 49 | 3.0945 | 0.795 | 0.3094 | 1.5507 | 0.795 | 0.7653 | 0.1765 | 0.0622 |
| No log | 8.0 | 56 | 3.0882 | 0.775 | 0.3056 | 1.5470 | 0.775 | 0.7500 | 0.1826 | 0.0634 |
| No log | 9.0 | 63 | 3.1861 | 0.745 | 0.3331 | 1.6432 | 0.745 | 0.7362 | 0.1822 | 0.0754 |
| No log | 10.0 | 70 | 2.9849 | 0.81 | 0.2789 | 1.5850 | 0.81 | 0.7802 | 0.1559 | 0.0548 |
| No log | 11.0 | 77 | 3.0131 | 0.795 | 0.3012 | 1.4820 | 0.795 | 0.7720 | 0.1627 | 0.0567 |
| No log | 12.0 | 84 | 2.9054 | 0.795 | 0.2734 | 1.4141 | 0.795 | 0.7843 | 0.1501 | 0.0535 |
| No log | 13.0 | 91 | 2.9704 | 0.815 | 0.2720 | 1.4241 | 0.815 | 0.8144 | 0.1584 | 0.0536 |
| No log | 14.0 | 98 | 2.9393 | 0.815 | 0.2627 | 1.4735 | 0.815 | 0.7902 | 0.1582 | 0.0504 |
| No log | 15.0 | 105 | 3.0346 | 0.805 | 0.2963 | 1.3649 | 0.805 | 0.7973 | 0.1617 | 0.0564 |
| No log | 16.0 | 112 | 2.9648 | 0.79 | 0.2839 | 1.6270 | 0.79 | 0.7722 | 0.1418 | 0.0525 |
| No log | 17.0 | 119 | 3.0458 | 0.82 | 0.2960 | 1.3476 | 0.82 | 0.8048 | 0.1575 | 0.0622 |
| No log | 18.0 | 126 | 2.8571 | 0.82 | 0.2754 | 1.3958 | 0.82 | 0.8081 | 0.1482 | 0.0493 |
| No log | 19.0 | 133 | 2.9429 | 0.775 | 0.2971 | 1.4302 | 0.775 | 0.7617 | 0.1616 | 0.0575 |
| No log | 20.0 | 140 | 2.8274 | 0.825 | 0.2698 | 1.3759 | 0.825 | 0.8081 | 0.1520 | 0.0449 |
| No log | 21.0 | 147 | 2.8769 | 0.81 | 0.2713 | 1.3604 | 0.81 | 0.8086 | 0.1390 | 0.0466 |
| No log | 22.0 | 154 | 2.8787 | 0.805 | 0.2694 | 1.3016 | 0.805 | 0.7975 | 0.1522 | 0.0435 |
| No log | 23.0 | 161 | 2.8771 | 0.825 | 0.2646 | 1.4753 | 0.825 | 0.8215 | 0.1414 | 0.0485 |
| No log | 24.0 | 168 | 2.8950 | 0.805 | 0.2774 | 1.2783 | 0.805 | 0.7754 | 0.1406 | 0.0495 |
| No log | 25.0 | 175 | 2.9780 | 0.825 | 0.2829 | 1.3207 | 0.825 | 0.8332 | 0.1402 | 0.0496 |
| No log | 26.0 | 182 | 2.8906 | 0.82 | 0.2653 | 1.3097 | 0.82 | 0.8007 | 0.1380 | 0.0454 |
| No log | 27.0 | 189 | 2.9385 | 0.82 | 0.2778 | 1.3039 | 0.82 | 0.8211 | 0.1489 | 0.0469 |
| No log | 28.0 | 196 | 2.8644 | 0.83 | 0.2618 | 1.4004 | 0.83 | 0.8325 | 0.1358 | 0.0494 |
| No log | 29.0 | 203 | 2.8761 | 0.82 | 0.2720 | 1.2220 | 0.82 | 0.8192 | 0.1411 | 0.0463 |
| No log | 30.0 | 210 | 2.8594 | 0.83 | 0.2620 | 1.3323 | 0.83 | 0.8130 | 0.1257 | 0.0448 |
| No log | 31.0 | 217 | 2.8946 | 0.825 | 0.2658 | 1.3388 | 0.825 | 0.8236 | 0.1322 | 0.0427 |
| No log | 32.0 | 224 | 2.8698 | 0.825 | 0.2712 | 1.3141 | 0.825 | 0.8107 | 0.1467 | 0.0473 |
| No log | 33.0 | 231 | 2.8106 | 0.83 | 0.2563 | 1.3750 | 0.83 | 0.8178 | 0.1126 | 0.0422 |
| No log | 34.0 | 238 | 2.9752 | 0.8 | 0.2881 | 1.3007 | 0.8000 | 0.7902 | 0.1522 | 0.0499 |
| No log | 35.0 | 245 | 2.8919 | 0.815 | 0.2886 | 1.3057 | 0.815 | 0.8149 | 0.1472 | 0.0468 |
| No log | 36.0 | 252 | 2.8863 | 0.81 | 0.2833 | 1.1973 | 0.81 | 0.8006 | 0.1453 | 0.0458 |
| No log | 37.0 | 259 | 2.8283 | 0.845 | 0.2685 | 1.2743 | 0.845 | 0.8438 | 0.1481 | 0.0451 |
| No log | 38.0 | 266 | 2.9174 | 0.815 | 0.2825 | 1.2658 | 0.815 | 0.7965 | 0.1408 | 0.0530 |
| No log | 39.0 | 273 | 2.8837 | 0.82 | 0.2775 | 1.2946 | 0.82 | 0.8050 | 0.1440 | 0.0472 |
| No log | 40.0 | 280 | 2.8585 | 0.835 | 0.2654 | 1.2830 | 0.835 | 0.8169 | 0.1450 | 0.0467 |
| No log | 41.0 | 287 | 2.9323 | 0.82 | 0.2809 | 1.2833 | 0.82 | 0.8085 | 0.1342 | 0.0490 |
| No log | 42.0 | 294 | 2.9525 | 0.82 | 0.2847 | 1.2331 | 0.82 | 0.8055 | 0.1352 | 0.0481 |
| No log | 43.0 | 301 | 2.9005 | 0.83 | 0.2819 | 1.2643 | 0.83 | 0.8225 | 0.1548 | 0.0482 |
| No log | 44.0 | 308 | 2.8388 | 0.83 | 0.2634 | 1.2662 | 0.83 | 0.8152 | 0.1286 | 0.0460 |
| No log | 45.0 | 315 | 2.8962 | 0.82 | 0.2752 | 1.3291 | 0.82 | 0.8127 | 0.1442 | 0.0496 |
| No log | 46.0 | 322 | 2.9479 | 0.815 | 0.2883 | 1.2433 | 0.815 | 0.7968 | 0.1540 | 0.0523 |
| No log | 47.0 | 329 | 2.8795 | 0.825 | 0.2737 | 1.2477 | 0.825 | 0.8260 | 0.1295 | 0.0447 |
| No log | 48.0 | 336 | 2.9872 | 0.815 | 0.2992 | 1.2556 | 0.815 | 0.8029 | 0.1379 | 0.0510 |
| No log | 49.0 | 343 | 2.8156 | 0.84 | 0.2536 | 1.2715 | 0.8400 | 0.8263 | 0.1240 | 0.0422 |
| No log | 50.0 | 350 | 2.9534 | 0.81 | 0.2924 | 1.3383 | 0.81 | 0.7937 | 0.1471 | 0.0478 |
| No log | 51.0 | 357 | 2.8604 | 0.855 | 0.2549 | 1.2566 | 0.855 | 0.8547 | 0.1318 | 0.0411 |
| No log | 52.0 | 364 | 2.9769 | 0.825 | 0.2828 | 1.2325 | 0.825 | 0.8160 | 0.1407 | 0.0480 |
| No log | 53.0 | 371 | 2.8717 | 0.84 | 0.2635 | 1.2511 | 0.8400 | 0.8342 | 0.1254 | 0.0434 |
| No log | 54.0 | 378 | 2.9313 | 0.825 | 0.2704 | 1.2676 | 0.825 | 0.8159 | 0.1310 | 0.0477 |
| No log | 55.0 | 385 | 2.8552 | 0.82 | 0.2638 | 1.2417 | 0.82 | 0.8031 | 0.1490 | 0.0435 |
| No log | 56.0 | 392 | 2.9680 | 0.845 | 0.2729 | 1.2530 | 0.845 | 0.8414 | 0.1349 | 0.0452 |
| No log | 57.0 | 399 | 2.9440 | 0.83 | 0.2796 | 1.2344 | 0.83 | 0.8222 | 0.1367 | 0.0450 |
| No log | 58.0 | 406 | 3.0577 | 0.815 | 0.2913 | 1.2232 | 0.815 | 0.8068 | 0.1447 | 0.0488 |
| No log | 59.0 | 413 | 2.8861 | 0.835 | 0.2643 | 1.2618 | 0.835 | 0.8280 | 0.1354 | 0.0422 |
| No log | 60.0 | 420 | 3.0007 | 0.825 | 0.2822 | 1.2352 | 0.825 | 0.8136 | 0.1342 | 0.0449 |
| No log | 61.0 | 427 | 2.9368 | 0.835 | 0.2746 | 1.2437 | 0.835 | 0.8258 | 0.1402 | 0.0437 |
| No log | 62.0 | 434 | 2.9202 | 0.835 | 0.2709 | 1.2281 | 0.835 | 0.8258 | 0.1435 | 0.0435 |
| No log | 63.0 | 441 | 2.9720 | 0.835 | 0.2768 | 1.2129 | 0.835 | 0.8354 | 0.1444 | 0.0460 |
| No log | 64.0 | 448 | 2.9993 | 0.835 | 0.2815 | 1.2250 | 0.835 | 0.8245 | 0.1526 | 0.0451 |
| No log | 65.0 | 455 | 2.9628 | 0.83 | 0.2725 | 1.2477 | 0.83 | 0.8190 | 0.1405 | 0.0439 |
| No log | 66.0 | 462 | 3.0418 | 0.825 | 0.2863 | 1.2244 | 0.825 | 0.8142 | 0.1447 | 0.0473 |
| No log | 67.0 | 469 | 3.0196 | 0.83 | 0.2797 | 1.2317 | 0.83 | 0.8223 | 0.1450 | 0.0463 |
| No log | 68.0 | 476 | 3.0227 | 0.835 | 0.2834 | 1.2362 | 0.835 | 0.8270 | 0.1416 | 0.0446 |
| No log | 69.0 | 483 | 3.0343 | 0.835 | 0.2837 | 1.2377 | 0.835 | 0.8310 | 0.1423 | 0.0455 |
| No log | 70.0 | 490 | 2.9982 | 0.835 | 0.2755 | 1.2247 | 0.835 | 0.8245 | 0.1306 | 0.0443 |
| No log | 71.0 | 497 | 3.0230 | 0.825 | 0.2860 | 1.2302 | 0.825 | 0.8171 | 0.1376 | 0.0464 |
| 2.5595 | 72.0 | 504 | 3.0254 | 0.83 | 0.2843 | 1.2190 | 0.83 | 0.8222 | 0.1386 | 0.0463 |
| 2.5595 | 73.0 | 511 | 3.0295 | 0.825 | 0.2851 | 1.2206 | 0.825 | 0.8192 | 0.1417 | 0.0462 |
| 2.5595 | 74.0 | 518 | 3.0381 | 0.83 | 0.2845 | 1.2130 | 0.83 | 0.8243 | 0.1423 | 0.0457 |
| 2.5595 | 75.0 | 525 | 3.0258 | 0.825 | 0.2837 | 1.2210 | 0.825 | 0.8171 | 0.1431 | 0.0460 |
| 2.5595 | 76.0 | 532 | 3.0694 | 0.825 | 0.2886 | 1.2091 | 0.825 | 0.8171 | 0.1533 | 0.0476 |
| 2.5595 | 77.0 | 539 | 3.0924 | 0.825 | 0.2939 | 1.2130 | 0.825 | 0.8171 | 0.1515 | 0.0473 |
| 2.5595 | 78.0 | 546 | 3.0956 | 0.82 | 0.2921 | 1.2081 | 0.82 | 0.8140 | 0.1539 | 0.0482 |
| 2.5595 | 79.0 | 553 | 3.0859 | 0.825 | 0.2884 | 1.2109 | 0.825 | 0.8220 | 0.1480 | 0.0468 |
| 2.5595 | 80.0 | 560 | 3.0740 | 0.825 | 0.2894 | 1.2081 | 0.825 | 0.8136 | 0.1399 | 0.0459 |
| 2.5595 | 81.0 | 567 | 3.0776 | 0.825 | 0.2901 | 1.2066 | 0.825 | 0.8171 | 0.1502 | 0.0462 |
| 2.5595 | 82.0 | 574 | 3.0736 | 0.83 | 0.2869 | 1.2100 | 0.83 | 0.8251 | 0.1405 | 0.0462 |
| 2.5595 | 83.0 | 581 | 3.0943 | 0.825 | 0.2919 | 1.2065 | 0.825 | 0.8171 | 0.1503 | 0.0464 |
| 2.5595 | 84.0 | 588 | 3.0857 | 0.825 | 0.2908 | 1.2080 | 0.825 | 0.8171 | 0.1456 | 0.0461 |
| 2.5595 | 85.0 | 595 | 3.0874 | 0.825 | 0.2890 | 1.2063 | 0.825 | 0.8171 | 0.1457 | 0.0461 |
| 2.5595 | 86.0 | 602 | 3.0863 | 0.825 | 0.2880 | 1.2069 | 0.825 | 0.8171 | 0.1453 | 0.0459 |
| 2.5595 | 87.0 | 609 | 3.0844 | 0.825 | 0.2882 | 1.2059 | 0.825 | 0.8171 | 0.1457 | 0.0456 |
| 2.5595 | 88.0 | 616 | 3.1011 | 0.825 | 0.2909 | 1.2034 | 0.825 | 0.8171 | 0.1557 | 0.0462 |
| 2.5595 | 89.0 | 623 | 3.1033 | 0.825 | 0.2912 | 1.2033 | 0.825 | 0.8171 | 0.1528 | 0.0463 |
| 2.5595 | 90.0 | 630 | 3.1004 | 0.825 | 0.2903 | 1.2029 | 0.825 | 0.8171 | 0.1541 | 0.0461 |
| 2.5595 | 91.0 | 637 | 3.0998 | 0.825 | 0.2900 | 1.2033 | 0.825 | 0.8171 | 0.1499 | 0.0459 |
| 2.5595 | 92.0 | 644 | 3.1039 | 0.825 | 0.2904 | 1.2023 | 0.825 | 0.8171 | 0.1535 | 0.0460 |
| 2.5595 | 93.0 | 651 | 3.1058 | 0.825 | 0.2906 | 1.2020 | 0.825 | 0.8171 | 0.1498 | 0.0460 |
| 2.5595 | 94.0 | 658 | 3.1057 | 0.825 | 0.2906 | 1.2022 | 0.825 | 0.8171 | 0.1504 | 0.0459 |
| 2.5595 | 95.0 | 665 | 3.1066 | 0.825 | 0.2908 | 1.2018 | 0.825 | 0.8171 | 0.1509 | 0.0460 |
| 2.5595 | 96.0 | 672 | 3.1069 | 0.825 | 0.2906 | 1.2018 | 0.825 | 0.8171 | 0.1506 | 0.0459 |
| 2.5595 | 97.0 | 679 | 3.1079 | 0.825 | 0.2906 | 1.2013 | 0.825 | 0.8171 | 0.1497 | 0.0459 |
| 2.5595 | 98.0 | 686 | 3.1085 | 0.825 | 0.2907 | 1.2013 | 0.825 | 0.8171 | 0.1500 | 0.0459 |
| 2.5595 | 99.0 | 693 | 3.1083 | 0.825 | 0.2907 | 1.2013 | 0.825 | 0.8171 | 0.1499 | 0.0460 |
| 2.5595 | 100.0 | 700 | 3.1084 | 0.825 | 0.2907 | 1.2013 | 0.825 | 0.8171 | 0.1500 | 0.0459 |

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
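
With these (or any reasonably recent) library versions, the checkpoint loads as a standard transformers image classifier. A minimal usage sketch, assuming the Hub id jordyvl/dit-base_tobacco-small_tobacco3482_kd_NKD_t1.0_g1.5 (the repository this card belongs to) and a placeholder image path:

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jordyvl/dit-base_tobacco-small_tobacco3482_kd_NKD_t1.0_g1.5"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```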
