
swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked_auc

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.1284
  • Accuracy: 0.9656
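
The card itself does not include usage code; the following is a minimal inference sketch, assuming the checkpoint is loaded from the Hub repository ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked_auc via the `transformers` Auto classes and that the input is a single RGB image (`example.png` is a placeholder file name).

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repository id for this fine-tuned checkpoint.
model_id = "ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked_auc"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder input image; replace with an image like those used for fine-tuning.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```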

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
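
The values above map onto `transformers.TrainingArguments` roughly as in the sketch below. This is a hedged reconstruction, not the original training script: the dataset pipeline, image processor, and metric computation are omitted, and the output directory and per-epoch evaluation/save strategies are assumptions (the per-epoch results table below is consistent with epoch-level evaluation).

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters. The default AdamW
# optimizer with betas=(0.9, 0.999) and epsilon=1e-08 matches the optimizer
# settings listed above, so it is not set explicitly here.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked_auc",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",           # assumption: evaluate once per epoch
    save_strategy="epoch",           # assumption: checkpoint once per epoch
)
```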

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.452  | 1.0   | 53   | 1.2923 | 0.5028 |
| 0.7902 | 2.0   | 106  | 0.6596 | 0.7262 |
| 0.6011 | 3.0   | 159  | 0.5092 | 0.7889 |
| 0.5202 | 4.0   | 212  | 0.4173 | 0.8361 |
| 0.4687 | 5.0   | 265  | 0.3498 | 0.8574 |
| 0.422  | 6.0   | 318  | 0.3566 | 0.8633 |
| 0.3678 | 7.0   | 371  | 0.3017 | 0.8877 |
| 0.3414 | 8.0   | 424  | 0.2601 | 0.9050 |
| 0.3249 | 9.0   | 477  | 0.2777 | 0.8884 |
| 0.3136 | 10.0  | 530  | 0.2625 | 0.9063 |
| 0.3163 | 11.0  | 583  | 0.2504 | 0.9063 |
| 0.3144 | 12.0  | 636  | 0.2146 | 0.9215 |
| 0.3062 | 13.0  | 689  | 0.2225 | 0.9198 |
| 0.2574 | 14.0  | 742  | 0.1928 | 0.9304 |
| 0.2487 | 15.0  | 795  | 0.1974 | 0.9236 |
| 0.2022 | 16.0  | 848  | 0.2147 | 0.9215 |
| 0.198  | 17.0  | 901  | 0.1708 | 0.9394 |
| 0.2174 | 18.0  | 954  | 0.2144 | 0.9156 |
| 0.225  | 19.0  | 1007 | 0.1866 | 0.9360 |
| 0.1886 | 20.0  | 1060 | 0.1512 | 0.9439 |
| 0.1797 | 21.0  | 1113 | 0.1520 | 0.9473 |
| 0.1779 | 22.0  | 1166 | 0.1944 | 0.9294 |
| 0.2006 | 23.0  | 1219 | 0.1722 | 0.9404 |
| 0.1647 | 24.0  | 1272 | 0.1401 | 0.9487 |
| 0.1766 | 25.0  | 1325 | 0.1587 | 0.9449 |
| 0.1347 | 26.0  | 1378 | 0.1525 | 0.9504 |
| 0.1533 | 27.0  | 1431 | 0.1336 | 0.9528 |
| 0.1322 | 28.0  | 1484 | 0.2079 | 0.9329 |
| 0.1291 | 29.0  | 1537 | 0.1421 | 0.9518 |
| 0.1397 | 30.0  | 1590 | 0.1457 | 0.9497 |
| 0.1189 | 31.0  | 1643 | 0.1530 | 0.9521 |
| 0.1404 | 32.0  | 1696 | 0.1818 | 0.9332 |
| 0.1431 | 33.0  | 1749 | 0.1486 | 0.9487 |
| 0.1214 | 34.0  | 1802 | 0.1555 | 0.9525 |
| 0.1195 | 35.0  | 1855 | 0.1852 | 0.9439 |
| 0.1161 | 36.0  | 1908 | 0.1670 | 0.9439 |
| 0.1052 | 37.0  | 1961 | 0.1551 | 0.9504 |
| 0.1004 | 38.0  | 2014 | 0.1535 | 0.9511 |
| 0.113  | 39.0  | 2067 | 0.1308 | 0.9514 |
| 0.114  | 40.0  | 2120 | 0.1752 | 0.9463 |
| 0.0807 | 41.0  | 2173 | 0.1467 | 0.9528 |
| 0.1044 | 42.0  | 2226 | 0.1289 | 0.9604 |
| 0.1118 | 43.0  | 2279 | 0.1602 | 0.9518 |
| 0.1305 | 44.0  | 2332 | 0.1699 | 0.9452 |
| 0.083  | 45.0  | 2385 | 0.1376 | 0.9563 |
| 0.1153 | 46.0  | 2438 | 0.1272 | 0.9594 |
| 0.0875 | 47.0  | 2491 | 0.1358 | 0.9559 |
| 0.0772 | 48.0  | 2544 | 0.1662 | 0.9501 |
| 0.084  | 49.0  | 2597 | 0.1456 | 0.9580 |
| 0.082  | 50.0  | 2650 | 0.1593 | 0.9483 |
| 0.0919 | 51.0  | 2703 | 0.1638 | 0.9483 |
| 0.0999 | 52.0  | 2756 | 0.1420 | 0.9532 |
| 0.0718 | 53.0  | 2809 | 0.1447 | 0.9549 |
| 0.0757 | 54.0  | 2862 | 0.1791 | 0.9490 |
| 0.0632 | 55.0  | 2915 | 0.1364 | 0.9604 |
| 0.0922 | 56.0  | 2968 | 0.1544 | 0.9525 |
| 0.0805 | 57.0  | 3021 | 0.1493 | 0.9552 |
| 0.0702 | 58.0  | 3074 | 0.1307 | 0.9570 |
| 0.0554 | 59.0  | 3127 | 0.1502 | 0.9532 |
| 0.0699 | 60.0  | 3180 | 0.1340 | 0.9590 |
| 0.0759 | 61.0  | 3233 | 0.1353 | 0.9576 |
| 0.0604 | 62.0  | 3286 | 0.1441 | 0.9570 |
| 0.0642 | 63.0  | 3339 | 0.1312 | 0.9601 |
| 0.0577 | 64.0  | 3392 | 0.1399 | 0.9597 |
| 0.0506 | 65.0  | 3445 | 0.1347 | 0.9594 |
| 0.0781 | 66.0  | 3498 | 0.1403 | 0.9601 |
| 0.0664 | 67.0  | 3551 | 0.1379 | 0.9587 |
| 0.0775 | 68.0  | 3604 | 0.1389 | 0.9573 |
| 0.0578 | 69.0  | 3657 | 0.1360 | 0.9570 |
| 0.0782 | 70.0  | 3710 | 0.1317 | 0.9580 |
| 0.0474 | 71.0  | 3763 | 0.1446 | 0.9594 |
| 0.0357 | 72.0  | 3816 | 0.1359 | 0.9618 |
| 0.0472 | 73.0  | 3869 | 0.1429 | 0.9590 |
| 0.071  | 74.0  | 3922 | 0.1333 | 0.9604 |
| 0.0663 | 75.0  | 3975 | 0.1327 | 0.9597 |
| 0.0536 | 76.0  | 4028 | 0.1396 | 0.9587 |
| 0.0549 | 77.0  | 4081 | 0.1392 | 0.9597 |
| 0.0621 | 78.0  | 4134 | 0.1408 | 0.9645 |
| 0.0531 | 79.0  | 4187 | 0.1406 | 0.9607 |
| 0.0464 | 80.0  | 4240 | 0.1463 | 0.9594 |
| 0.0526 | 81.0  | 4293 | 0.1355 | 0.9638 |
| 0.0277 | 82.0  | 4346 | 0.1464 | 0.9635 |
| 0.0558 | 83.0  | 4399 | 0.1487 | 0.9604 |
| 0.0466 | 84.0  | 4452 | 0.1319 | 0.9642 |
| 0.0463 | 85.0  | 4505 | 0.1443 | 0.9621 |
| 0.0397 | 86.0  | 4558 | 0.1494 | 0.9611 |
| 0.0489 | 87.0  | 4611 | 0.1428 | 0.9642 |
| 0.0354 | 88.0  | 4664 | 0.1387 | 0.9645 |
| 0.0457 | 89.0  | 4717 | 0.1362 | 0.9638 |
| 0.0522 | 90.0  | 4770 | 0.1332 | 0.9656 |
| 0.0481 | 91.0  | 4823 | 0.1352 | 0.9642 |
| 0.0472 | 92.0  | 4876 | 0.1375 | 0.9673 |
| 0.0362 | 93.0  | 4929 | 0.1354 | 0.9656 |
| 0.0432 | 94.0  | 4982 | 0.1306 | 0.9632 |
| 0.037  | 95.0  | 5035 | 0.1283 | 0.9663 |
| 0.0525 | 96.0  | 5088 | 0.1273 | 0.9666 |
| 0.0349 | 97.0  | 5141 | 0.1279 | 0.9659 |
| 0.0411 | 98.0  | 5194 | 0.1279 | 0.9659 |
| 0.044  | 99.0  | 5247 | 0.1283 | 0.9659 |
| 0.0289 | 100.0 | 5300 | 0.1284 | 0.9656 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 1.12.1+cu113
  • Datasets 2.21.0
  • Tokenizers 0.19.1
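
For reproducibility, the environment can be pinned to the versions above. The snippet below is a sketch; the CUDA 11.3 extra index URL is an assumption about where the matching PyTorch wheel is hosted, and it can be dropped or changed for a CPU-only or different CUDA build.

```python
import subprocess
import sys

# Pin the framework versions listed above via pip.
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "transformers==4.44.0",
    "datasets==2.21.0",
    "tokenizers==0.19.1",
    "torch==1.12.1+cu113",
    "--extra-index-url", "https://download.pytorch.org/whl/cu113",
])
```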