# swin-base_tobacco
This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window8-256](https://huggingface.co/microsoft/swinv2-base-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.6059
- Accuracy: 0.835
- Brier Loss: 0.2576
- NLL (negative log-likelihood): 1.2824
- F1 Micro: 0.835
- F1 Macro: 0.8348
- ECE (expected calibration error): 0.1310
- AURC (area under the risk-coverage curve): 0.0387
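The Brier loss, NLL, ECE, and AURC entries measure calibration and selective-prediction quality rather than raw accuracy. The evaluation code is not part of this card, so the sketch below uses common definitions of these metrics, assuming `probs` is an `(N, C)` array of softmax outputs and `labels` an `(N,)` array of integer class ids; the script that produced the numbers above may differ in detail (e.g. in binning or reduction).

```python
import numpy as np

def brier_loss(probs, labels):
    # Mean over samples of the squared distance to the one-hot target.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    # Negative log-likelihood of the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    # Expected calibration error: weighted gap between confidence and
    # accuracy within equal-width confidence bins.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    err = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return err

def aurc(probs, labels):
    # Area under the risk-coverage curve: sort by descending confidence,
    # then average the cumulative error rate over all coverage levels.
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels)[np.argsort(-conf)]
    risks = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return risks.mean()
```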
## Model description
More information needed
## Intended uses & limitations
More information needed
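No usage notes are provided, but since the checkpoint is a fine-tuned SwinV2 image classifier, the sketch below shows one plausible way to run it; the model path and image file name are placeholders, not part of this card.

```python
# A minimal inference sketch, assuming the checkpoint keeps the standard
# SwinV2 image-classification head of the base model.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("path/to/swin-base_tobacco")
model = AutoModelForImageClassification.from_pretrained("path/to/swin-base_tobacco")
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```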
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
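As a convenience, the sketch below shows how the hyperparameters above map onto `TrainingArguments` in Transformers 4.26.x; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the `TrainingArguments` defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-base_tobacco",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # 16 * 16 = total train batch size of 256
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```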
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
---|---|---|---|---|---|---|---|---|---|---|
No log | 0.96 | 3 | 2.3165 | 0.11 | 0.9031 | 7.6310 | 0.11 | 0.0604 | 0.2004 | 0.8718 |
No log | 1.96 | 6 | 2.2894 | 0.155 | 0.8975 | 6.8146 | 0.155 | 0.0944 | 0.2230 | 0.8555 |
No log | 2.96 | 9 | 2.2481 | 0.215 | 0.8888 | 5.1480 | 0.2150 | 0.1472 | 0.2492 | 0.8119 |
No log | 3.96 | 12 | 2.1955 | 0.275 | 0.8770 | 4.2879 | 0.275 | 0.1939 | 0.2844 | 0.6562 |
No log | 4.96 | 15 | 2.1326 | 0.36 | 0.8619 | 3.8809 | 0.36 | 0.2199 | 0.3357 | 0.4962 |
No log | 5.96 | 18 | 2.0568 | 0.375 | 0.8415 | 3.9254 | 0.375 | 0.2309 | 0.3377 | 0.4471 |
No log | 6.96 | 21 | 1.9639 | 0.375 | 0.8126 | 3.8158 | 0.375 | 0.2319 | 0.3195 | 0.4534 |
No log | 7.96 | 24 | 1.8621 | 0.375 | 0.7781 | 3.3244 | 0.375 | 0.2456 | 0.2924 | 0.4833 |
No log | 8.96 | 27 | 1.7100 | 0.44 | 0.7273 | 2.8211 | 0.44 | 0.3136 | 0.3188 | 0.3515 |
No log | 9.96 | 30 | 1.5377 | 0.535 | 0.6611 | 2.4560 | 0.535 | 0.4259 | 0.3557 | 0.2259 |
No log | 10.96 | 33 | 1.3588 | 0.595 | 0.5825 | 2.3216 | 0.595 | 0.4933 | 0.2986 | 0.1795 |
No log | 11.96 | 36 | 1.2072 | 0.62 | 0.5215 | 2.3831 | 0.62 | 0.5352 | 0.2927 | 0.1541 |
No log | 12.96 | 39 | 1.0766 | 0.67 | 0.4715 | 2.2078 | 0.67 | 0.5966 | 0.2727 | 0.1219 |
No log | 13.96 | 42 | 0.9699 | 0.675 | 0.4408 | 1.8028 | 0.675 | 0.5961 | 0.2568 | 0.1215 |
No log | 14.96 | 45 | 0.8660 | 0.68 | 0.4011 | 1.4772 | 0.68 | 0.5978 | 0.2176 | 0.1014 |
No log | 15.96 | 48 | 0.7907 | 0.725 | 0.3709 | 1.4755 | 0.7250 | 0.6768 | 0.2055 | 0.0904 |
No log | 16.96 | 51 | 0.7362 | 0.75 | 0.3501 | 1.3822 | 0.75 | 0.7077 | 0.2042 | 0.0806 |
No log | 17.96 | 54 | 0.6867 | 0.76 | 0.3322 | 1.3191 | 0.76 | 0.7177 | 0.1926 | 0.0724 |
No log | 18.96 | 57 | 0.6572 | 0.78 | 0.3203 | 1.2996 | 0.78 | 0.7424 | 0.1920 | 0.0699 |
No log | 19.96 | 60 | 0.6074 | 0.785 | 0.2967 | 1.3136 | 0.785 | 0.7686 | 0.1705 | 0.0589 |
No log | 20.96 | 63 | 0.6050 | 0.795 | 0.2956 | 1.3729 | 0.795 | 0.7793 | 0.1762 | 0.0600 |
No log | 21.96 | 66 | 0.5748 | 0.83 | 0.2785 | 1.3558 | 0.83 | 0.8113 | 0.1744 | 0.0529 |
No log | 22.96 | 69 | 0.5722 | 0.815 | 0.2756 | 1.3937 | 0.815 | 0.8097 | 0.1767 | 0.0489 |
No log | 23.96 | 72 | 0.5689 | 0.795 | 0.2750 | 1.3641 | 0.795 | 0.7947 | 0.1452 | 0.0539 |
No log | 24.96 | 75 | 0.5536 | 0.825 | 0.2718 | 1.2773 | 0.825 | 0.8068 | 0.1698 | 0.0509 |
No log | 25.96 | 78 | 0.5464 | 0.805 | 0.2726 | 1.2772 | 0.805 | 0.7888 | 0.1499 | 0.0487 |
No log | 26.96 | 81 | 0.5455 | 0.81 | 0.2626 | 1.3607 | 0.81 | 0.8080 | 0.1750 | 0.0471 |
No log | 27.96 | 84 | 0.5542 | 0.815 | 0.2609 | 1.3643 | 0.815 | 0.8089 | 0.1521 | 0.0466 |
No log | 28.96 | 87 | 0.5480 | 0.82 | 0.2710 | 1.2996 | 0.82 | 0.8227 | 0.1422 | 0.0468 |
No log | 29.96 | 90 | 0.5507 | 0.83 | 0.2654 | 1.3425 | 0.83 | 0.8320 | 0.1491 | 0.0475 |
No log | 30.96 | 93 | 0.5608 | 0.815 | 0.2591 | 1.4365 | 0.815 | 0.8145 | 0.1405 | 0.0442 |
No log | 31.96 | 96 | 0.5473 | 0.825 | 0.2622 | 1.3600 | 0.825 | 0.8198 | 0.1339 | 0.0424 |
No log | 32.96 | 99 | 0.5296 | 0.83 | 0.2588 | 1.2906 | 0.83 | 0.8311 | 0.1373 | 0.0416 |
No log | 33.96 | 102 | 0.5370 | 0.82 | 0.2522 | 1.2895 | 0.82 | 0.8214 | 0.1428 | 0.0436 |
No log | 34.96 | 105 | 0.5578 | 0.8 | 0.2707 | 1.3364 | 0.8000 | 0.8056 | 0.1708 | 0.0481 |
No log | 35.96 | 108 | 0.5193 | 0.825 | 0.2484 | 1.2883 | 0.825 | 0.8250 | 0.1316 | 0.0405 |
No log | 36.96 | 111 | 0.5306 | 0.815 | 0.2569 | 1.2856 | 0.815 | 0.8093 | 0.1344 | 0.0420 |
No log | 37.96 | 114 | 0.5824 | 0.815 | 0.2729 | 1.3994 | 0.815 | 0.8182 | 0.1418 | 0.0479 |
No log | 38.96 | 117 | 0.5486 | 0.82 | 0.2549 | 1.2974 | 0.82 | 0.8259 | 0.1312 | 0.0443 |
No log | 39.96 | 120 | 0.5421 | 0.83 | 0.2545 | 1.3575 | 0.83 | 0.8316 | 0.1491 | 0.0415 |
No log | 40.96 | 123 | 0.5477 | 0.81 | 0.2700 | 1.3251 | 0.81 | 0.8166 | 0.1499 | 0.0418 |
No log | 41.96 | 126 | 0.5404 | 0.825 | 0.2553 | 1.3186 | 0.825 | 0.8309 | 0.1519 | 0.0414 |
No log | 42.96 | 129 | 0.5698 | 0.83 | 0.2598 | 1.3249 | 0.83 | 0.8386 | 0.1396 | 0.0452 |
No log | 43.96 | 132 | 0.5538 | 0.815 | 0.2605 | 1.3122 | 0.815 | 0.8212 | 0.1410 | 0.0430 |
No log | 44.96 | 135 | 0.5369 | 0.81 | 0.2586 | 1.3030 | 0.81 | 0.8141 | 0.1404 | 0.0409 |
No log | 45.96 | 138 | 0.5614 | 0.825 | 0.2615 | 1.3881 | 0.825 | 0.8278 | 0.1404 | 0.0427 |
No log | 46.96 | 141 | 0.5636 | 0.825 | 0.2601 | 1.4077 | 0.825 | 0.8286 | 0.1345 | 0.0421 |
No log | 47.96 | 144 | 0.5783 | 0.83 | 0.2684 | 1.3350 | 0.83 | 0.8304 | 0.1373 | 0.0422 |
No log | 48.96 | 147 | 0.5749 | 0.825 | 0.2663 | 1.3167 | 0.825 | 0.8241 | 0.1308 | 0.0424 |
No log | 49.96 | 150 | 0.5802 | 0.82 | 0.2692 | 1.3191 | 0.82 | 0.8194 | 0.1217 | 0.0461 |
No log | 50.96 | 153 | 0.5696 | 0.82 | 0.2639 | 1.3330 | 0.82 | 0.8175 | 0.1372 | 0.0429 |
No log | 51.96 | 156 | 0.5827 | 0.84 | 0.2656 | 1.3975 | 0.8400 | 0.8444 | 0.1378 | 0.0426 |
No log | 52.96 | 159 | 0.5725 | 0.805 | 0.2669 | 1.3172 | 0.805 | 0.7997 | 0.1459 | 0.0422 |
No log | 53.96 | 162 | 0.5769 | 0.805 | 0.2691 | 1.3111 | 0.805 | 0.7991 | 0.1457 | 0.0434 |
No log | 54.96 | 165 | 0.5883 | 0.805 | 0.2647 | 1.4581 | 0.805 | 0.8104 | 0.1405 | 0.0430 |
No log | 55.96 | 168 | 0.5834 | 0.835 | 0.2543 | 1.4586 | 0.835 | 0.8349 | 0.1346 | 0.0407 |
No log | 56.96 | 171 | 0.5875 | 0.835 | 0.2543 | 1.3211 | 0.835 | 0.8358 | 0.1320 | 0.0402 |
No log | 57.96 | 174 | 0.5741 | 0.84 | 0.2533 | 1.3027 | 0.8400 | 0.8405 | 0.1290 | 0.0395 |
No log | 58.96 | 177 | 0.5737 | 0.82 | 0.2624 | 1.3104 | 0.82 | 0.8167 | 0.1437 | 0.0396 |
No log | 59.96 | 180 | 0.5796 | 0.815 | 0.2603 | 1.4021 | 0.815 | 0.8154 | 0.1286 | 0.0406 |
No log | 60.96 | 183 | 0.5711 | 0.83 | 0.2553 | 1.4016 | 0.83 | 0.8306 | 0.1272 | 0.0390 |
No log | 61.96 | 186 | 0.5670 | 0.825 | 0.2591 | 1.3136 | 0.825 | 0.8263 | 0.1429 | 0.0406 |
No log | 62.96 | 189 | 0.5736 | 0.825 | 0.2592 | 1.3077 | 0.825 | 0.8231 | 0.1244 | 0.0417 |
No log | 63.96 | 192 | 0.5730 | 0.83 | 0.2531 | 1.3007 | 0.83 | 0.8274 | 0.1275 | 0.0401 |
No log | 64.96 | 195 | 0.6130 | 0.82 | 0.2687 | 1.3014 | 0.82 | 0.8246 | 0.1484 | 0.0414 |
No log | 65.96 | 198 | 0.6023 | 0.825 | 0.2596 | 1.3107 | 0.825 | 0.8254 | 0.1373 | 0.0404 |
No log | 66.96 | 201 | 0.5923 | 0.825 | 0.2599 | 1.3078 | 0.825 | 0.8263 | 0.1312 | 0.0411 |
No log | 67.96 | 204 | 0.6197 | 0.81 | 0.2766 | 1.3046 | 0.81 | 0.8035 | 0.1373 | 0.0451 |
No log | 68.96 | 207 | 0.5918 | 0.805 | 0.2651 | 1.3019 | 0.805 | 0.8044 | 0.1407 | 0.0404 |
No log | 69.96 | 210 | 0.5908 | 0.835 | 0.2544 | 1.3286 | 0.835 | 0.8344 | 0.1354 | 0.0394 |
No log | 70.96 | 213 | 0.5941 | 0.83 | 0.2558 | 1.3019 | 0.83 | 0.8324 | 0.1402 | 0.0401 |
No log | 71.96 | 216 | 0.5994 | 0.82 | 0.2588 | 1.2998 | 0.82 | 0.8215 | 0.1297 | 0.0411 |
No log | 72.96 | 219 | 0.6083 | 0.825 | 0.2638 | 1.3525 | 0.825 | 0.8257 | 0.1379 | 0.0410 |
No log | 73.96 | 222 | 0.5980 | 0.825 | 0.2609 | 1.3515 | 0.825 | 0.8295 | 0.1457 | 0.0394 |
No log | 74.96 | 225 | 0.5945 | 0.83 | 0.2568 | 1.3670 | 0.83 | 0.8302 | 0.1324 | 0.0390 |
No log | 75.96 | 228 | 0.5982 | 0.845 | 0.2535 | 1.4552 | 0.845 | 0.8476 | 0.1246 | 0.0390 |
No log | 76.96 | 231 | 0.5850 | 0.83 | 0.2507 | 1.3700 | 0.83 | 0.8287 | 0.1348 | 0.0391 |
No log | 77.96 | 234 | 0.5859 | 0.825 | 0.2566 | 1.2917 | 0.825 | 0.8232 | 0.1309 | 0.0394 |
No log | 78.96 | 237 | 0.6085 | 0.835 | 0.2630 | 1.3516 | 0.835 | 0.8370 | 0.1329 | 0.0420 |
No log | 79.96 | 240 | 0.6108 | 0.835 | 0.2621 | 1.2943 | 0.835 | 0.8370 | 0.1395 | 0.0414 |
No log | 80.96 | 243 | 0.6061 | 0.81 | 0.2596 | 1.2898 | 0.81 | 0.8119 | 0.1313 | 0.0413 |
No log | 81.96 | 246 | 0.6006 | 0.815 | 0.2564 | 1.2952 | 0.815 | 0.8122 | 0.1453 | 0.0406 |
No log | 82.96 | 249 | 0.6050 | 0.825 | 0.2577 | 1.2998 | 0.825 | 0.8283 | 0.1271 | 0.0400 |
No log | 83.96 | 252 | 0.6197 | 0.835 | 0.2658 | 1.3021 | 0.835 | 0.8386 | 0.1222 | 0.0414 |
No log | 84.96 | 255 | 0.6086 | 0.825 | 0.2651 | 1.2889 | 0.825 | 0.8251 | 0.1207 | 0.0404 |
No log | 85.96 | 258 | 0.5965 | 0.83 | 0.2587 | 1.2929 | 0.83 | 0.8304 | 0.1323 | 0.0397 |
No log | 86.96 | 261 | 0.5897 | 0.82 | 0.2550 | 1.2980 | 0.82 | 0.8171 | 0.1372 | 0.0394 |
No log | 87.96 | 264 | 0.5887 | 0.83 | 0.2551 | 1.2950 | 0.83 | 0.8290 | 0.1251 | 0.0391 |
No log | 88.96 | 267 | 0.5958 | 0.82 | 0.2598 | 1.2871 | 0.82 | 0.8180 | 0.1319 | 0.0392 |
No log | 89.96 | 270 | 0.6088 | 0.82 | 0.2658 | 1.2805 | 0.82 | 0.8184 | 0.1513 | 0.0396 |
No log | 90.96 | 273 | 0.6192 | 0.825 | 0.2692 | 1.2772 | 0.825 | 0.8263 | 0.1258 | 0.0402 |
No log | 91.96 | 276 | 0.6230 | 0.825 | 0.2689 | 1.2777 | 0.825 | 0.8263 | 0.1416 | 0.0404 |
No log | 92.96 | 279 | 0.6223 | 0.83 | 0.2667 | 1.2792 | 0.83 | 0.8318 | 0.1296 | 0.0401 |
No log | 93.96 | 282 | 0.6145 | 0.83 | 0.2627 | 1.2797 | 0.83 | 0.8321 | 0.1265 | 0.0394 |
No log | 94.96 | 285 | 0.6105 | 0.83 | 0.2610 | 1.2807 | 0.83 | 0.8321 | 0.1352 | 0.0392 |
No log | 95.96 | 288 | 0.6095 | 0.83 | 0.2602 | 1.2815 | 0.83 | 0.8321 | 0.1360 | 0.0390 |
No log | 96.96 | 291 | 0.6076 | 0.835 | 0.2590 | 1.2824 | 0.835 | 0.8348 | 0.1255 | 0.0389 |
No log | 97.96 | 294 | 0.6060 | 0.835 | 0.2578 | 1.2827 | 0.835 | 0.8348 | 0.1281 | 0.0388 |
No log | 98.96 | 297 | 0.6058 | 0.835 | 0.2575 | 1.2825 | 0.835 | 0.8348 | 0.1410 | 0.0387 |
No log | 99.96 | 300 | 0.6059 | 0.835 | 0.2576 | 1.2824 | 0.835 | 0.8348 | 0.1310 | 0.0387 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
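A quick sanity check that a local environment matches the versions above; the import names are assumed to match the pip distributions, and the PyTorch `post200` suffix indicates a conda build.

```python
import datasets, tokenizers, torch, transformers

for mod, expected in [
    (transformers, "4.26.1"),
    (torch, "1.13.1.post200"),
    (datasets, "2.9.0"),
    (tokenizers, "0.13.2"),
]:
    print(f"{mod.__name__:12s} {mod.__version__:18s} (card: {expected})")
```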