# dit-base_tobacco-small_tobacco3482_kd_MSE

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set:

- Loss: 0.7746
- Accuracy: 0.81
- Brier Loss: 0.2775
- NLL: 1.1981
- F1 Micro: 0.81
- F1 Macro: 0.7980
- ECE: 0.1403
- AURC: 0.0500
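
Brier loss, NLL, ECE, and AURC above are probabilistic and calibration metrics rather than plain accuracy-style scores. For orientation, here is a minimal NumPy sketch of two of them, using the standard definitions of the multiclass Brier score and ECE; this is illustrative and not necessarily the exact evaluation code behind this card:

```python
import numpy as np

def brier_score(probs, labels):
    """Multiclass Brier score: mean squared distance between the
    predicted probability vector and the one-hot true label."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: weighted average gap between accuracy and mean confidence
    inside equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            ece += mask.mean() * abs(acc - conf[mask].mean())
    return float(ece)
```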

## Model description

More information needed
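
The model name suggests this checkpoint was produced by knowledge distillation (`kd`) from a DiT-base teacher fine-tuned on Tobacco3482 into this ViT-small student, with an MSE objective (`MSE`) between teacher and student logits. The card does not document this, so the following PyTorch sketch is only an assumption-laden illustration of MSE logit distillation, not the actual training code; the loss weighting `alpha` is a placeholder:

```python
import torch.nn.functional as F

def kd_mse_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Hypothetical combined objective: cross-entropy on the ground-truth
    labels plus MSE between student and (frozen) teacher logits.
    `alpha` is an assumed weighting, not documented for this model."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits.detach())
    return alpha * ce + (1.0 - alpha) * mse
```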

## Intended uses & limitations

More information needed
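
For completeness, a minimal inference sketch using the standard `transformers` image-classification pipeline; the image path is a placeholder for a scanned document image:

```python
from transformers import pipeline

# Load this checkpoint with the generic image-classification pipeline.
classifier = pipeline(
    "image-classification",
    model="jordyvl/dit-base_tobacco-small_tobacco3482_kd_MSE",
)

# "scanned_document.png" is a placeholder path to a document image.
print(classifier("scanned_document.png"))
```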

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
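
These hyperparameters map roughly onto a Transformers `TrainingArguments` configuration. A hedged sketch follows; `output_dir` and anything not listed above are assumptions, and the Adam betas/epsilon match the library defaults so they need no explicit setting:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the listed hyperparameters.
args = TrainingArguments(
    output_dir="dit-base_tobacco-small_tobacco3482_kd_MSE",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```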

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 7.0083 | 0.14 | 0.9250 | 5.4655 | 0.14 | 0.1468 | 0.2824 | 0.8920 |
| No log | 2.0 | 14 | 5.8247 | 0.35 | 0.7844 | 3.6804 | 0.35 | 0.2240 | 0.2541 | 0.5601 |
| No log | 3.0 | 21 | 4.1788 | 0.49 | 0.6140 | 1.8305 | 0.49 | 0.4563 | 0.2512 | 0.2825 |
| No log | 4.0 | 28 | 2.7911 | 0.66 | 0.4534 | 1.6541 | 0.66 | 0.5604 | 0.2299 | 0.1475 |
| No log | 5.0 | 35 | 2.3354 | 0.74 | 0.3892 | 1.8678 | 0.74 | 0.6851 | 0.2104 | 0.0989 |
| No log | 6.0 | 42 | 1.9675 | 0.73 | 0.3585 | 1.3943 | 0.7300 | 0.6822 | 0.1846 | 0.0930 |
| No log | 7.0 | 49 | 1.7187 | 0.79 | 0.3190 | 1.3921 | 0.79 | 0.7510 | 0.1739 | 0.0760 |
| No log | 8.0 | 56 | 1.6507 | 0.77 | 0.3469 | 1.3682 | 0.7700 | 0.7289 | 0.1834 | 0.0851 |
| No log | 9.0 | 63 | 1.2713 | 0.79 | 0.3040 | 1.4042 | 0.79 | 0.7622 | 0.1505 | 0.0540 |
| No log | 10.0 | 70 | 1.1461 | 0.805 | 0.2852 | 1.3953 | 0.805 | 0.7849 | 0.1371 | 0.0522 |
| No log | 11.0 | 77 | 1.1328 | 0.81 | 0.2713 | 1.3113 | 0.81 | 0.7901 | 0.1371 | 0.0442 |
| No log | 12.0 | 84 | 1.2818 | 0.8 | 0.3192 | 1.2680 | 0.8000 | 0.7808 | 0.1674 | 0.0725 |
| No log | 13.0 | 91 | 1.0493 | 0.805 | 0.2767 | 1.2512 | 0.805 | 0.7846 | 0.1451 | 0.0535 |
| No log | 14.0 | 98 | 0.9657 | 0.815 | 0.2802 | 1.1796 | 0.815 | 0.7965 | 0.1680 | 0.0487 |
| No log | 15.0 | 105 | 0.9910 | 0.82 | 0.2695 | 1.3658 | 0.82 | 0.8000 | 0.1400 | 0.0475 |
| No log | 16.0 | 112 | 0.9828 | 0.81 | 0.2823 | 1.3175 | 0.81 | 0.7974 | 0.1390 | 0.0549 |
| No log | 17.0 | 119 | 0.9279 | 0.8 | 0.2815 | 1.3727 | 0.8000 | 0.7882 | 0.1599 | 0.0454 |
| No log | 18.0 | 126 | 1.0076 | 0.805 | 0.2929 | 1.2999 | 0.805 | 0.7825 | 0.1480 | 0.0562 |
| No log | 19.0 | 133 | 0.9524 | 0.82 | 0.2705 | 1.3029 | 0.82 | 0.8122 | 0.1481 | 0.0454 |
| No log | 20.0 | 140 | 1.0584 | 0.795 | 0.3010 | 1.3019 | 0.795 | 0.7699 | 0.1669 | 0.0650 |
| No log | 21.0 | 147 | 0.9390 | 0.805 | 0.2775 | 1.4073 | 0.805 | 0.7888 | 0.1211 | 0.0513 |
| No log | 22.0 | 154 | 0.9857 | 0.81 | 0.2895 | 1.2894 | 0.81 | 0.7879 | 0.1469 | 0.0548 |
| No log | 23.0 | 161 | 0.9137 | 0.795 | 0.2809 | 1.4461 | 0.795 | 0.7872 | 0.1528 | 0.0472 |
| No log | 24.0 | 168 | 0.8545 | 0.815 | 0.2844 | 1.2582 | 0.815 | 0.7981 | 0.1466 | 0.0484 |
| No log | 25.0 | 175 | 0.8860 | 0.81 | 0.2766 | 1.4525 | 0.81 | 0.8010 | 0.1241 | 0.0457 |
| No log | 26.0 | 182 | 0.8624 | 0.83 | 0.2813 | 1.1993 | 0.83 | 0.8222 | 0.1536 | 0.0512 |
| No log | 27.0 | 189 | 0.9119 | 0.805 | 0.2894 | 1.4164 | 0.805 | 0.7869 | 0.1576 | 0.0519 |
| No log | 28.0 | 196 | 0.9072 | 0.82 | 0.2753 | 1.2927 | 0.82 | 0.8149 | 0.1292 | 0.0514 |
| No log | 29.0 | 203 | 0.8428 | 0.8 | 0.2805 | 1.3065 | 0.8000 | 0.7820 | 0.1368 | 0.0502 |
| No log | 30.0 | 210 | 0.8696 | 0.81 | 0.2858 | 1.2825 | 0.81 | 0.7989 | 0.1454 | 0.0524 |
| No log | 31.0 | 217 | 0.8542 | 0.8 | 0.2861 | 1.2029 | 0.8000 | 0.7766 | 0.1412 | 0.0496 |
| No log | 32.0 | 224 | 0.8576 | 0.805 | 0.2896 | 1.3371 | 0.805 | 0.7814 | 0.1513 | 0.0515 |
| No log | 33.0 | 231 | 0.8615 | 0.8 | 0.2859 | 1.2347 | 0.8000 | 0.7826 | 0.1473 | 0.0522 |
| No log | 34.0 | 238 | 0.8474 | 0.805 | 0.2807 | 1.3510 | 0.805 | 0.7946 | 0.1493 | 0.0524 |
| No log | 35.0 | 245 | 0.9058 | 0.79 | 0.3035 | 1.2005 | 0.79 | 0.7768 | 0.1497 | 0.0553 |
| No log | 36.0 | 252 | 0.8461 | 0.805 | 0.2897 | 1.2770 | 0.805 | 0.7906 | 0.1599 | 0.0513 |
| No log | 37.0 | 259 | 0.8461 | 0.805 | 0.2962 | 1.1989 | 0.805 | 0.7912 | 0.1527 | 0.0533 |
| No log | 38.0 | 266 | 0.8646 | 0.815 | 0.2817 | 1.3653 | 0.815 | 0.8031 | 0.1355 | 0.0499 |
| No log | 39.0 | 273 | 0.8306 | 0.8 | 0.2905 | 1.1852 | 0.8000 | 0.7862 | 0.1528 | 0.0549 |
| No log | 40.0 | 280 | 0.8561 | 0.815 | 0.2838 | 1.2577 | 0.815 | 0.8005 | 0.1431 | 0.0544 |
| No log | 41.0 | 287 | 0.8236 | 0.805 | 0.2836 | 1.2093 | 0.805 | 0.7925 | 0.1376 | 0.0490 |
| No log | 42.0 | 294 | 0.8221 | 0.805 | 0.2853 | 1.1929 | 0.805 | 0.7805 | 0.1397 | 0.0524 |
| No log | 43.0 | 301 | 0.7834 | 0.815 | 0.2666 | 1.2720 | 0.815 | 0.8006 | 0.1316 | 0.0496 |
| No log | 44.0 | 308 | 0.8022 | 0.8 | 0.2839 | 1.2009 | 0.8000 | 0.7870 | 0.1457 | 0.0514 |
| No log | 45.0 | 315 | 0.8009 | 0.81 | 0.2735 | 1.3505 | 0.81 | 0.7970 | 0.1359 | 0.0494 |
| No log | 46.0 | 322 | 0.8029 | 0.81 | 0.2775 | 1.1956 | 0.81 | 0.7983 | 0.1476 | 0.0509 |
| No log | 47.0 | 329 | 0.7979 | 0.82 | 0.2818 | 1.2005 | 0.82 | 0.8049 | 0.1466 | 0.0488 |
| No log | 48.0 | 336 | 0.7763 | 0.815 | 0.2784 | 1.1905 | 0.815 | 0.7970 | 0.1358 | 0.0512 |
| No log | 49.0 | 343 | 0.7917 | 0.81 | 0.2802 | 1.2136 | 0.81 | 0.7989 | 0.1429 | 0.0486 |
| No log | 50.0 | 350 | 0.8223 | 0.825 | 0.2809 | 1.1860 | 0.825 | 0.8042 | 0.1567 | 0.0520 |
| No log | 51.0 | 357 | 0.7952 | 0.82 | 0.2747 | 1.2074 | 0.82 | 0.8078 | 0.1377 | 0.0484 |
| No log | 52.0 | 364 | 0.7868 | 0.825 | 0.2714 | 1.2850 | 0.825 | 0.8170 | 0.1371 | 0.0476 |
| No log | 53.0 | 371 | 0.8111 | 0.805 | 0.2869 | 1.1892 | 0.805 | 0.7954 | 0.1467 | 0.0524 |
| No log | 54.0 | 378 | 0.7739 | 0.81 | 0.2755 | 1.1946 | 0.81 | 0.7953 | 0.1567 | 0.0486 |
| No log | 55.0 | 385 | 0.7930 | 0.825 | 0.2825 | 1.2000 | 0.825 | 0.8087 | 0.1546 | 0.0518 |
| No log | 56.0 | 392 | 0.7826 | 0.815 | 0.2789 | 1.1953 | 0.815 | 0.8031 | 0.1353 | 0.0514 |
| No log | 57.0 | 399 | 0.7716 | 0.82 | 0.2714 | 1.3115 | 0.82 | 0.8079 | 0.1207 | 0.0470 |
| No log | 58.0 | 406 | 0.8036 | 0.815 | 0.2878 | 1.1875 | 0.815 | 0.7945 | 0.1469 | 0.0531 |
| No log | 59.0 | 413 | 0.7714 | 0.82 | 0.2722 | 1.2787 | 0.82 | 0.8128 | 0.1264 | 0.0467 |
| No log | 60.0 | 420 | 0.7671 | 0.825 | 0.2720 | 1.2722 | 0.825 | 0.8136 | 0.1378 | 0.0476 |
| No log | 61.0 | 427 | 0.7885 | 0.815 | 0.2834 | 1.1798 | 0.815 | 0.8007 | 0.1480 | 0.0526 |
| No log | 62.0 | 434 | 0.7621 | 0.82 | 0.2706 | 1.3459 | 0.82 | 0.8102 | 0.1156 | 0.0482 |
| No log | 63.0 | 441 | 0.7691 | 0.81 | 0.2797 | 1.1379 | 0.81 | 0.7959 | 0.1429 | 0.0506 |
| No log | 64.0 | 448 | 0.7699 | 0.81 | 0.2776 | 1.1964 | 0.81 | 0.7974 | 0.1473 | 0.0494 |
| No log | 65.0 | 455 | 0.7693 | 0.82 | 0.2739 | 1.2089 | 0.82 | 0.8106 | 0.1390 | 0.0481 |
| No log | 66.0 | 462 | 0.7891 | 0.81 | 0.2805 | 1.1989 | 0.81 | 0.7927 | 0.1530 | 0.0513 |
| No log | 67.0 | 469 | 0.7806 | 0.82 | 0.2798 | 1.2033 | 0.82 | 0.8068 | 0.1408 | 0.0485 |
| No log | 68.0 | 476 | 0.7877 | 0.82 | 0.2815 | 1.1896 | 0.82 | 0.8054 | 0.1376 | 0.0501 |
| No log | 69.0 | 483 | 0.7649 | 0.825 | 0.2731 | 1.1567 | 0.825 | 0.8155 | 0.1371 | 0.0479 |
| No log | 70.0 | 490 | 0.7740 | 0.82 | 0.2764 | 1.1929 | 0.82 | 0.8107 | 0.1250 | 0.0511 |
| No log | 71.0 | 497 | 0.7657 | 0.82 | 0.2744 | 1.2762 | 0.82 | 0.8068 | 0.1374 | 0.0488 |
| 0.4804 | 72.0 | 504 | 0.7887 | 0.805 | 0.2839 | 1.1851 | 0.805 | 0.7914 | 0.1524 | 0.0513 |
| 0.4804 | 73.0 | 511 | 0.7662 | 0.815 | 0.2759 | 1.1973 | 0.815 | 0.8010 | 0.1395 | 0.0496 |
| 0.4804 | 74.0 | 518 | 0.7706 | 0.825 | 0.2742 | 1.2020 | 0.825 | 0.8196 | 0.1398 | 0.0492 |
| 0.4804 | 75.0 | 525 | 0.7780 | 0.815 | 0.2802 | 1.1881 | 0.815 | 0.7970 | 0.1392 | 0.0505 |
| 0.4804 | 76.0 | 532 | 0.7731 | 0.825 | 0.2745 | 1.2695 | 0.825 | 0.8152 | 0.1548 | 0.0485 |
| 0.4804 | 77.0 | 539 | 0.7743 | 0.825 | 0.2762 | 1.2039 | 0.825 | 0.8109 | 0.1326 | 0.0490 |
| 0.4804 | 78.0 | 546 | 0.7782 | 0.805 | 0.2792 | 1.2001 | 0.805 | 0.7905 | 0.1381 | 0.0506 |
| 0.4804 | 79.0 | 553 | 0.7786 | 0.81 | 0.2807 | 1.1929 | 0.81 | 0.7980 | 0.1394 | 0.0505 |
| 0.4804 | 80.0 | 560 | 0.7759 | 0.82 | 0.2772 | 1.1973 | 0.82 | 0.8081 | 0.1296 | 0.0494 |
| 0.4804 | 81.0 | 567 | 0.7703 | 0.82 | 0.2758 | 1.2069 | 0.82 | 0.8096 | 0.1405 | 0.0491 |
| 0.4804 | 82.0 | 574 | 0.7749 | 0.81 | 0.2777 | 1.1996 | 0.81 | 0.7980 | 0.1502 | 0.0501 |
| 0.4804 | 83.0 | 581 | 0.7768 | 0.815 | 0.2777 | 1.2009 | 0.815 | 0.8052 | 0.1237 | 0.0496 |
| 0.4804 | 84.0 | 588 | 0.7761 | 0.815 | 0.2778 | 1.1986 | 0.815 | 0.8008 | 0.1333 | 0.0495 |
| 0.4804 | 85.0 | 595 | 0.7771 | 0.815 | 0.2780 | 1.1984 | 0.815 | 0.8008 | 0.1335 | 0.0497 |
| 0.4804 | 86.0 | 602 | 0.7755 | 0.81 | 0.2777 | 1.1987 | 0.81 | 0.7980 | 0.1327 | 0.0501 |
| 0.4804 | 87.0 | 609 | 0.7749 | 0.81 | 0.2776 | 1.1974 | 0.81 | 0.7980 | 0.1261 | 0.0499 |
| 0.4804 | 88.0 | 616 | 0.7746 | 0.815 | 0.2776 | 1.1981 | 0.815 | 0.8052 | 0.1238 | 0.0497 |
| 0.4804 | 89.0 | 623 | 0.7744 | 0.81 | 0.2776 | 1.1981 | 0.81 | 0.7980 | 0.1283 | 0.0500 |
| 0.4804 | 90.0 | 630 | 0.7743 | 0.81 | 0.2774 | 1.1987 | 0.81 | 0.7980 | 0.1346 | 0.0499 |
| 0.4804 | 91.0 | 637 | 0.7741 | 0.81 | 0.2774 | 1.1981 | 0.81 | 0.7980 | 0.1379 | 0.0499 |
| 0.4804 | 92.0 | 644 | 0.7742 | 0.81 | 0.2774 | 1.1982 | 0.81 | 0.7980 | 0.1403 | 0.0499 |
| 0.4804 | 93.0 | 651 | 0.7745 | 0.81 | 0.2775 | 1.1982 | 0.81 | 0.7980 | 0.1403 | 0.0500 |
| 0.4804 | 94.0 | 658 | 0.7746 | 0.81 | 0.2776 | 1.1978 | 0.81 | 0.7980 | 0.1316 | 0.0500 |
| 0.4804 | 95.0 | 665 | 0.7745 | 0.81 | 0.2775 | 1.1982 | 0.81 | 0.7980 | 0.1380 | 0.0499 |
| 0.4804 | 96.0 | 672 | 0.7746 | 0.81 | 0.2775 | 1.1981 | 0.81 | 0.7980 | 0.1315 | 0.0500 |
| 0.4804 | 97.0 | 679 | 0.7746 | 0.81 | 0.2775 | 1.1981 | 0.81 | 0.7980 | 0.1403 | 0.0500 |
| 0.4804 | 98.0 | 686 | 0.7746 | 0.81 | 0.2775 | 1.1981 | 0.81 | 0.7980 | 0.1403 | 0.0500 |
| 0.4804 | 99.0 | 693 | 0.7746 | 0.81 | 0.2775 | 1.1981 | 0.81 | 0.7980 | 0.1403 | 0.0500 |
| 0.4804 | 100.0 | 700 | 0.7746 | 0.81 | 0.2775 | 1.1981 | 0.81 | 0.7980 | 0.1403 | 0.0500 |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1