
dit-base_tobacco-tiny_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224; the card's dataset field was left unset (None), though the model name points to the Tobacco3482 document-image dataset. It achieves the following results on the evaluation set (a sketch of the calibration metrics follows the list):

  • Loss: 0.6206
  • Accuracy: 0.825
  • Brier Loss: 0.2570
  • NLL: 0.9939
  • F1 Micro: 0.825
  • F1 Macro: 0.8166
  • ECE: 0.1370
  • AURC: 0.0444
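
Brier loss, NLL, ECE, and AURC measure the quality of the predicted probabilities rather than raw accuracy. For reference, here is a minimal sketch of how the Brier loss and ECE are typically computed from softmax outputs; the function names are illustrative, not part of this repository:

```python
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier score: mean squared distance to the one-hot target."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """ECE: per-bin gap between accuracy and mean confidence, weighted by bin size."""
    confidence = probs.max(axis=1)
    prediction = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidence > lo) & (confidence <= hi)
        if in_bin.any():
            accuracy = (prediction[in_bin] == labels[in_bin]).mean()
            ece += in_bin.mean() * abs(accuracy - confidence[in_bin].mean())
    return float(ece)
```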

Model description

More information needed from the author. Judging from the model name, this appears to be a ~5.5M-parameter ViT-tiny student distilled from a DiT-base teacher on Tobacco3482, trained with a combined cross-entropy + knowledge-distillation (CEKD) objective at temperature 2.5 and mixing weight alpha = 0.5, as sketched below.
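
The suffixes in the model name (CEKD, t2.5, a0.5) suggest the standard Hinton-style formulation of distillation. The training code is not included in this card, but a minimal sketch of such an objective would look like:

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, T: float = 2.5, alpha: float = 0.5):
    # Hard-label cross-entropy against the ground-truth classes.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KL divergence against the teacher; the T**2 factor keeps
    # gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kd
```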

Intended uses & limitations

More information needed
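
No usage guidance is provided. Below is a minimal inference sketch, assuming the checkpoint loads as a standard transformers image-classification model (document.png is a placeholder input):

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jordyvl/dit-base_tobacco-tiny_tobacco3482_kd_CEKD_t2.5_a0.5"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder document image
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```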

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
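
Expressed as transformers TrainingArguments, these settings correspond roughly to the sketch below; output_dir is a placeholder, and the Adam betas and epsilon listed above are the library defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dit-base_tobacco-tiny_tobacco3482_kd_CEKD_t2.5_a0.5",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```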

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 7 | 4.0511 | 0.22 | 0.9320 | 7.5162 | 0.22 | 0.0792 | 0.3104 | 0.7615 |
| No log | 2.0 | 14 | 3.4353 | 0.35 | 0.8214 | 5.1797 | 0.35 | 0.2316 | 0.2589 | 0.6234 |
| No log | 3.0 | 21 | 2.6406 | 0.47 | 0.6828 | 2.9202 | 0.47 | 0.3725 | 0.2781 | 0.3277 |
| No log | 4.0 | 28 | 2.0027 | 0.57 | 0.5596 | 1.8624 | 0.57 | 0.4971 | 0.2526 | 0.2124 |
| No log | 5.0 | 35 | 1.5018 | 0.65 | 0.4518 | 1.7094 | 0.65 | 0.6128 | 0.2242 | 0.1396 |
| No log | 6.0 | 42 | 1.3904 | 0.71 | 0.4105 | 1.8765 | 0.7100 | 0.7058 | 0.2202 | 0.1126 |
| No log | 7.0 | 49 | 1.1226 | 0.76 | 0.3558 | 1.8024 | 0.76 | 0.7029 | 0.1815 | 0.0841 |
| No log | 8.0 | 56 | 1.1810 | 0.73 | 0.3716 | 1.5642 | 0.7300 | 0.7027 | 0.1956 | 0.0834 |
| No log | 9.0 | 63 | 1.2131 | 0.73 | 0.3811 | 1.7544 | 0.7300 | 0.6774 | 0.2070 | 0.0872 |
| No log | 10.0 | 70 | 1.3986 | 0.72 | 0.4043 | 2.0161 | 0.72 | 0.7259 | 0.2021 | 0.1098 |
| No log | 11.0 | 77 | 1.1001 | 0.765 | 0.3202 | 1.9113 | 0.765 | 0.7578 | 0.1859 | 0.0678 |
| No log | 12.0 | 84 | 1.0429 | 0.77 | 0.3487 | 1.2955 | 0.7700 | 0.7663 | 0.1910 | 0.0827 |
| No log | 13.0 | 91 | 0.9864 | 0.77 | 0.3227 | 1.3721 | 0.7700 | 0.7734 | 0.1692 | 0.0710 |
| No log | 14.0 | 98 | 1.0068 | 0.74 | 0.3581 | 1.3362 | 0.74 | 0.7271 | 0.1848 | 0.0804 |
| No log | 15.0 | 105 | 0.8635 | 0.795 | 0.3009 | 1.4785 | 0.795 | 0.7810 | 0.1646 | 0.0538 |
| No log | 16.0 | 112 | 0.8157 | 0.81 | 0.2845 | 1.2525 | 0.81 | 0.7931 | 0.1545 | 0.0519 |
| No log | 17.0 | 119 | 0.8616 | 0.78 | 0.3186 | 1.4230 | 0.78 | 0.7705 | 0.1610 | 0.0647 |
| No log | 18.0 | 126 | 0.8034 | 0.8 | 0.2784 | 1.4410 | 0.8000 | 0.7811 | 0.1576 | 0.0489 |
| No log | 19.0 | 133 | 0.7601 | 0.805 | 0.2697 | 1.2885 | 0.805 | 0.7823 | 0.1499 | 0.0494 |
| No log | 20.0 | 140 | 0.7598 | 0.82 | 0.2709 | 1.3643 | 0.82 | 0.8090 | 0.1542 | 0.0516 |
| No log | 21.0 | 147 | 0.8221 | 0.79 | 0.2905 | 1.4031 | 0.79 | 0.7640 | 0.1612 | 0.0585 |
| No log | 22.0 | 154 | 0.7271 | 0.825 | 0.2599 | 1.0950 | 0.825 | 0.8147 | 0.1381 | 0.0454 |
| No log | 23.0 | 161 | 0.7556 | 0.795 | 0.2891 | 1.1111 | 0.795 | 0.7822 | 0.1413 | 0.0558 |
| No log | 24.0 | 168 | 0.7197 | 0.81 | 0.2759 | 1.1361 | 0.81 | 0.7905 | 0.1617 | 0.0500 |
| No log | 25.0 | 175 | 0.7192 | 0.83 | 0.2620 | 1.3395 | 0.83 | 0.8155 | 0.1459 | 0.0433 |
| No log | 26.0 | 182 | 0.7347 | 0.805 | 0.2821 | 1.1396 | 0.805 | 0.7868 | 0.1512 | 0.0541 |
| No log | 27.0 | 189 | 0.7402 | 0.815 | 0.2805 | 1.3562 | 0.815 | 0.7928 | 0.1489 | 0.0519 |
| No log | 28.0 | 196 | 0.6986 | 0.815 | 0.2562 | 1.1454 | 0.815 | 0.7944 | 0.1467 | 0.0443 |
| No log | 29.0 | 203 | 0.7148 | 0.81 | 0.2718 | 1.1404 | 0.81 | 0.7944 | 0.1440 | 0.0513 |
| No log | 30.0 | 210 | 0.7041 | 0.81 | 0.2796 | 1.3773 | 0.81 | 0.7998 | 0.1484 | 0.0494 |
| No log | 31.0 | 217 | 0.7428 | 0.815 | 0.2823 | 1.1146 | 0.815 | 0.7967 | 0.1626 | 0.0542 |
| No log | 32.0 | 224 | 0.6941 | 0.82 | 0.2682 | 1.1921 | 0.82 | 0.8098 | 0.1427 | 0.0478 |
| No log | 33.0 | 231 | 0.7170 | 0.81 | 0.2794 | 1.2244 | 0.81 | 0.7875 | 0.1407 | 0.0511 |
| No log | 34.0 | 238 | 0.7024 | 0.815 | 0.2805 | 1.0423 | 0.815 | 0.8043 | 0.1560 | 0.0512 |
| No log | 35.0 | 245 | 0.7299 | 0.81 | 0.2710 | 1.1835 | 0.81 | 0.7964 | 0.1475 | 0.0530 |
| No log | 36.0 | 252 | 0.6488 | 0.83 | 0.2500 | 1.1662 | 0.83 | 0.8117 | 0.1315 | 0.0431 |
| No log | 37.0 | 259 | 0.6877 | 0.815 | 0.2751 | 1.0878 | 0.815 | 0.7973 | 0.1381 | 0.0489 |
| No log | 38.0 | 266 | 0.7019 | 0.84 | 0.2620 | 1.2709 | 0.8400 | 0.8282 | 0.1607 | 0.0498 |
| No log | 39.0 | 273 | 0.6687 | 0.81 | 0.2680 | 1.3004 | 0.81 | 0.7959 | 0.1346 | 0.0465 |
| No log | 40.0 | 280 | 0.6813 | 0.81 | 0.2809 | 1.0539 | 0.81 | 0.7929 | 0.1628 | 0.0500 |
| No log | 41.0 | 287 | 0.6525 | 0.83 | 0.2493 | 1.1496 | 0.83 | 0.8176 | 0.1413 | 0.0437 |
| No log | 42.0 | 294 | 0.6526 | 0.835 | 0.2547 | 1.2429 | 0.835 | 0.8253 | 0.1420 | 0.0450 |
| No log | 43.0 | 301 | 0.6696 | 0.82 | 0.2717 | 1.0446 | 0.82 | 0.8118 | 0.1486 | 0.0501 |
| No log | 44.0 | 308 | 0.6555 | 0.83 | 0.2626 | 0.9948 | 0.83 | 0.8214 | 0.1366 | 0.0461 |
| No log | 45.0 | 315 | 0.6380 | 0.82 | 0.2600 | 1.2151 | 0.82 | 0.8026 | 0.1263 | 0.0428 |
| No log | 46.0 | 322 | 0.6356 | 0.82 | 0.2571 | 1.0923 | 0.82 | 0.8114 | 0.1443 | 0.0449 |
| No log | 47.0 | 329 | 0.6444 | 0.815 | 0.2638 | 1.0657 | 0.815 | 0.7980 | 0.1503 | 0.0476 |
| No log | 48.0 | 336 | 0.6337 | 0.82 | 0.2676 | 1.0650 | 0.82 | 0.8077 | 0.1370 | 0.0442 |
| No log | 49.0 | 343 | 0.6271 | 0.84 | 0.2541 | 1.1500 | 0.8400 | 0.8230 | 0.1365 | 0.0422 |
| No log | 50.0 | 350 | 0.6284 | 0.81 | 0.2588 | 1.2703 | 0.81 | 0.7964 | 0.1411 | 0.0425 |
| No log | 51.0 | 357 | 0.6507 | 0.82 | 0.2612 | 1.1306 | 0.82 | 0.7996 | 0.1558 | 0.0460 |
| No log | 52.0 | 364 | 0.6329 | 0.825 | 0.2602 | 1.2060 | 0.825 | 0.8146 | 0.1296 | 0.0439 |
| No log | 53.0 | 371 | 0.6342 | 0.825 | 0.2574 | 1.0132 | 0.825 | 0.8158 | 0.1467 | 0.0434 |
| No log | 54.0 | 378 | 0.6486 | 0.82 | 0.2633 | 1.1662 | 0.82 | 0.8060 | 0.1445 | 0.0466 |
| No log | 55.0 | 385 | 0.6245 | 0.825 | 0.2588 | 1.1358 | 0.825 | 0.8088 | 0.1428 | 0.0429 |
| No log | 56.0 | 392 | 0.6303 | 0.815 | 0.2616 | 0.9843 | 0.815 | 0.8013 | 0.1447 | 0.0458 |
| No log | 57.0 | 399 | 0.6196 | 0.82 | 0.2545 | 1.1936 | 0.82 | 0.8076 | 0.1516 | 0.0438 |
| No log | 58.0 | 406 | 0.6241 | 0.82 | 0.2620 | 1.0557 | 0.82 | 0.8100 | 0.1423 | 0.0450 |
| No log | 59.0 | 413 | 0.6278 | 0.82 | 0.2579 | 1.0777 | 0.82 | 0.8076 | 0.1382 | 0.0451 |
| No log | 60.0 | 420 | 0.6385 | 0.81 | 0.2651 | 0.9962 | 0.81 | 0.7910 | 0.1565 | 0.0467 |
| No log | 61.0 | 427 | 0.6328 | 0.82 | 0.2619 | 0.9968 | 0.82 | 0.8103 | 0.1299 | 0.0469 |
| No log | 62.0 | 434 | 0.6195 | 0.82 | 0.2571 | 0.9997 | 0.82 | 0.8062 | 0.1471 | 0.0438 |
| No log | 63.0 | 441 | 0.6150 | 0.825 | 0.2560 | 1.0061 | 0.825 | 0.8166 | 0.1498 | 0.0430 |
| No log | 64.0 | 448 | 0.6201 | 0.825 | 0.2574 | 1.0592 | 0.825 | 0.8166 | 0.1369 | 0.0442 |
| No log | 65.0 | 455 | 0.6281 | 0.815 | 0.2601 | 0.9990 | 0.815 | 0.8013 | 0.1449 | 0.0459 |
| No log | 66.0 | 462 | 0.6232 | 0.825 | 0.2538 | 1.0657 | 0.825 | 0.8166 | 0.1341 | 0.0442 |
| No log | 67.0 | 469 | 0.6242 | 0.82 | 0.2567 | 1.0622 | 0.82 | 0.8100 | 0.1432 | 0.0445 |
| No log | 68.0 | 476 | 0.6213 | 0.82 | 0.2598 | 1.0666 | 0.82 | 0.8100 | 0.1517 | 0.0447 |
| No log | 69.0 | 483 | 0.6268 | 0.82 | 0.2577 | 1.0106 | 0.82 | 0.8100 | 0.1365 | 0.0455 |
| No log | 70.0 | 490 | 0.6252 | 0.82 | 0.2579 | 0.9979 | 0.82 | 0.8100 | 0.1395 | 0.0451 |
| No log | 71.0 | 497 | 0.6251 | 0.82 | 0.2589 | 1.0606 | 0.82 | 0.8100 | 0.1485 | 0.0448 |
| 0.3286 | 72.0 | 504 | 0.6212 | 0.825 | 0.2571 | 1.0034 | 0.825 | 0.8166 | 0.1448 | 0.0443 |
| 0.3286 | 73.0 | 511 | 0.6212 | 0.82 | 0.2584 | 0.9940 | 0.82 | 0.8100 | 0.1499 | 0.0444 |
| 0.3286 | 74.0 | 518 | 0.6214 | 0.82 | 0.2576 | 0.9914 | 0.82 | 0.8100 | 0.1411 | 0.0448 |
| 0.3286 | 75.0 | 525 | 0.6233 | 0.82 | 0.2580 | 0.9966 | 0.82 | 0.8100 | 0.1592 | 0.0450 |
| 0.3286 | 76.0 | 532 | 0.6214 | 0.82 | 0.2568 | 0.9952 | 0.82 | 0.8100 | 0.1404 | 0.0448 |
| 0.3286 | 77.0 | 539 | 0.6217 | 0.825 | 0.2575 | 0.9951 | 0.825 | 0.8166 | 0.1361 | 0.0445 |
| 0.3286 | 78.0 | 546 | 0.6220 | 0.82 | 0.2569 | 0.9964 | 0.82 | 0.8100 | 0.1385 | 0.0450 |
| 0.3286 | 79.0 | 553 | 0.6225 | 0.82 | 0.2581 | 0.9950 | 0.82 | 0.8100 | 0.1485 | 0.0450 |
| 0.3286 | 80.0 | 560 | 0.6213 | 0.82 | 0.2578 | 0.9912 | 0.82 | 0.8100 | 0.1381 | 0.0446 |
| 0.3286 | 81.0 | 567 | 0.6209 | 0.82 | 0.2572 | 0.9948 | 0.82 | 0.8100 | 0.1415 | 0.0447 |
| 0.3286 | 82.0 | 574 | 0.6213 | 0.82 | 0.2578 | 0.9958 | 0.82 | 0.8100 | 0.1422 | 0.0449 |
| 0.3286 | 83.0 | 581 | 0.6220 | 0.82 | 0.2579 | 0.9947 | 0.82 | 0.8100 | 0.1553 | 0.0448 |
| 0.3286 | 84.0 | 588 | 0.6212 | 0.82 | 0.2574 | 0.9915 | 0.82 | 0.8100 | 0.1418 | 0.0447 |
| 0.3286 | 85.0 | 595 | 0.6220 | 0.82 | 0.2579 | 0.9937 | 0.82 | 0.8100 | 0.1628 | 0.0450 |
| 0.3286 | 86.0 | 602 | 0.6207 | 0.82 | 0.2572 | 0.9945 | 0.82 | 0.8100 | 0.1412 | 0.0447 |
| 0.3286 | 87.0 | 609 | 0.6212 | 0.82 | 0.2573 | 0.9940 | 0.82 | 0.8100 | 0.1414 | 0.0447 |
| 0.3286 | 88.0 | 616 | 0.6201 | 0.825 | 0.2570 | 0.9943 | 0.825 | 0.8166 | 0.1366 | 0.0443 |
| 0.3286 | 89.0 | 623 | 0.6210 | 0.82 | 0.2573 | 0.9944 | 0.82 | 0.8100 | 0.1414 | 0.0448 |
| 0.3286 | 90.0 | 630 | 0.6207 | 0.82 | 0.2572 | 0.9942 | 0.82 | 0.8100 | 0.1414 | 0.0447 |
| 0.3286 | 91.0 | 637 | 0.6210 | 0.82 | 0.2572 | 0.9952 | 0.82 | 0.8100 | 0.1415 | 0.0447 |
| 0.3286 | 92.0 | 644 | 0.6205 | 0.82 | 0.2572 | 0.9939 | 0.82 | 0.8100 | 0.1414 | 0.0447 |
| 0.3286 | 93.0 | 651 | 0.6207 | 0.825 | 0.2570 | 0.9938 | 0.825 | 0.8166 | 0.1373 | 0.0445 |
| 0.3286 | 94.0 | 658 | 0.6206 | 0.82 | 0.2572 | 0.9945 | 0.82 | 0.8100 | 0.1414 | 0.0447 |
| 0.3286 | 95.0 | 665 | 0.6203 | 0.825 | 0.2568 | 0.9951 | 0.825 | 0.8166 | 0.1370 | 0.0444 |
| 0.3286 | 96.0 | 672 | 0.6205 | 0.82 | 0.2571 | 0.9942 | 0.82 | 0.8100 | 0.1413 | 0.0448 |
| 0.3286 | 97.0 | 679 | 0.6206 | 0.825 | 0.2570 | 0.9943 | 0.825 | 0.8166 | 0.1370 | 0.0445 |
| 0.3286 | 98.0 | 686 | 0.6206 | 0.825 | 0.2570 | 0.9942 | 0.825 | 0.8166 | 0.1370 | 0.0445 |
| 0.3286 | 99.0 | 693 | 0.6206 | 0.825 | 0.2570 | 0.9940 | 0.825 | 0.8166 | 0.1370 | 0.0445 |
| 0.3286 | 100.0 | 700 | 0.6206 | 0.825 | 0.2570 | 0.9939 | 0.825 | 0.8166 | 0.1370 | 0.0444 |

Framework versions

  • Transformers 4.36.0.dev0
  • PyTorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1