swinv2-tiny-patch4-window8-256-DMAE-da3-colab

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3992
  • Accuracy: 0.3696

Model description

More information needed

Intended uses & limitations

More information needed
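
Pending fuller documentation, the model can be loaded for image classification through the standard transformers API. Below is a minimal inference sketch; the image path is illustrative, and the label set comes from the fine-tuning dataset:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "Augusto777/swinv2-tiny-patch4-window8-256-DMAE-da3-colab"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```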

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 120
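
As a sketch, the listed settings map onto the following TrainingArguments; the output_dir and any setting not listed above are assumptions:

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; unlisted values are assumptions.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-DMAE-da3-colab",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 16 x 4 = total train batch size of 64
    num_train_epochs=120,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```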

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3523 | 0.9778 | 22 | 1.4024 | 0.3261 |
| 1.3805 | 2.0 | 45 | 1.3775 | 0.2609 |
| 1.3221 | 2.9778 | 67 | 1.4419 | 0.3043 |
| 1.297 | 4.0 | 90 | 1.3582 | 0.3261 |
| 1.353 | 4.9778 | 112 | 1.3406 | 0.3478 |
| 1.2627 | 6.0 | 135 | 1.3824 | 0.1522 |
| 1.3006 | 6.9778 | 157 | 1.4008 | 0.1522 |
| 1.2438 | 8.0 | 180 | 1.3769 | 0.3261 |
| 1.222 | 8.9778 | 202 | 1.4212 | 0.3043 |
| 1.2221 | 10.0 | 225 | 1.4223 | 0.2391 |
| 1.2262 | 10.9778 | 247 | 1.4154 | 0.2609 |
| 1.2381 | 12.0 | 270 | 1.3327 | 0.2391 |
| 1.227 | 12.9778 | 292 | 1.2887 | 0.2826 |
| 1.2158 | 14.0 | 315 | 1.3465 | 0.2609 |
| 1.2174 | 14.9778 | 337 | 1.3476 | 0.3043 |
| 1.1767 | 16.0 | 360 | 1.4024 | 0.1957 |
| 1.2067 | 16.9778 | 382 | 1.3664 | 0.1739 |
| 1.2303 | 18.0 | 405 | 1.4260 | 0.2826 |
| 1.222 | 18.9778 | 427 | 1.4807 | 0.1739 |
| 1.2026 | 20.0 | 450 | 1.3851 | 0.1739 |
| 1.2185 | 20.9778 | 472 | 1.3214 | 0.2609 |
| 1.2773 | 22.0 | 495 | 1.4404 | 0.1957 |
| 1.227 | 22.9778 | 517 | 1.4535 | 0.2391 |
| 1.2032 | 24.0 | 540 | 1.3967 | 0.3043 |
| 1.2223 | 24.9778 | 562 | 1.4090 | 0.3261 |
| 1.2527 | 26.0 | 585 | 1.4858 | 0.2609 |
| 1.2203 | 26.9778 | 607 | 1.4366 | 0.1739 |
| 1.1993 | 28.0 | 630 | 1.4056 | 0.2609 |
| 1.2014 | 28.9778 | 652 | 1.3755 | 0.3043 |
| 1.2027 | 30.0 | 675 | 1.4579 | 0.2609 |
| 1.1961 | 30.9778 | 697 | 1.4524 | 0.2609 |
| 1.1939 | 32.0 | 720 | 1.4488 | 0.2391 |
| 1.1889 | 32.9778 | 742 | 1.4568 | 0.1522 |
| 1.1871 | 34.0 | 765 | 1.3814 | 0.3261 |
| 1.1778 | 34.9778 | 787 | 1.4403 | 0.1304 |
| 1.2404 | 36.0 | 810 | 1.4437 | 0.1957 |
| 1.197 | 36.9778 | 832 | 1.4765 | 0.2174 |
| 1.2161 | 38.0 | 855 | 1.3720 | 0.2391 |
| 1.221 | 38.9778 | 877 | 1.3750 | 0.3478 |
| 1.229 | 40.0 | 900 | 1.3405 | 0.2391 |
| 1.2046 | 40.9778 | 922 | 1.4231 | 0.2609 |
| 1.2077 | 42.0 | 945 | 1.4384 | 0.2391 |
| 1.1865 | 42.9778 | 967 | 1.4346 | 0.2609 |
| 1.1882 | 44.0 | 990 | 1.3679 | 0.2826 |
| 1.2528 | 44.9778 | 1012 | 1.3451 | 0.2174 |
| 1.1836 | 46.0 | 1035 | 1.4913 | 0.2391 |
| 1.2009 | 46.9778 | 1057 | 1.4841 | 0.3261 |
| 1.203 | 48.0 | 1080 | 1.4326 | 0.3043 |
| 1.1679 | 48.9778 | 1102 | 1.3935 | 0.3043 |
| 1.179 | 50.0 | 1125 | 1.4185 | 0.1957 |
| 1.1687 | 50.9778 | 1147 | 1.3686 | 0.2826 |
| 1.1779 | 52.0 | 1170 | 1.4319 | 0.1957 |
| 1.1566 | 52.9778 | 1192 | 1.3801 | 0.1957 |
| 1.192 | 54.0 | 1215 | 1.3746 | 0.2174 |
| 1.1803 | 54.9778 | 1237 | 1.4017 | 0.1957 |
| 1.194 | 56.0 | 1260 | 1.4288 | 0.1957 |
| 1.1486 | 56.9778 | 1282 | 1.3920 | 0.3043 |
| 1.1429 | 58.0 | 1305 | 1.4616 | 0.2391 |
| 1.1655 | 58.9778 | 1327 | 1.4119 | 0.2174 |
| 1.1697 | 60.0 | 1350 | 1.3812 | 0.2609 |
| 1.1898 | 60.9778 | 1372 | 1.4009 | 0.2391 |
| 1.1882 | 62.0 | 1395 | 1.4221 | 0.2391 |
| 1.134 | 62.9778 | 1417 | 1.6190 | 0.2609 |
| 1.1748 | 64.0 | 1440 | 1.4336 | 0.2391 |
| 1.1439 | 64.9778 | 1462 | 1.3744 | 0.1957 |
| 1.1585 | 66.0 | 1485 | 1.3992 | 0.3696 |
| 1.1344 | 66.9778 | 1507 | 1.3952 | 0.2391 |
| 1.1374 | 68.0 | 1530 | 1.3666 | 0.2174 |
| 1.1252 | 68.9778 | 1552 | 1.3705 | 0.2826 |
| 1.1339 | 70.0 | 1575 | 1.3983 | 0.2826 |
| 1.1344 | 70.9778 | 1597 | 1.3792 | 0.3043 |
| 1.1343 | 72.0 | 1620 | 1.4467 | 0.2826 |
| 1.1555 | 72.9778 | 1642 | 1.4823 | 0.2174 |
| 1.1329 | 74.0 | 1665 | 1.5136 | 0.1522 |
| 1.1513 | 74.9778 | 1687 | 1.4791 | 0.2391 |
| 1.1278 | 76.0 | 1710 | 1.4527 | 0.2609 |
| 1.0956 | 76.9778 | 1732 | 1.4840 | 0.2391 |
| 1.1131 | 78.0 | 1755 | 1.4900 | 0.2174 |
| 1.1376 | 78.9778 | 1777 | 1.5395 | 0.2174 |
| 1.0883 | 80.0 | 1800 | 1.5038 | 0.1957 |
| 1.1017 | 80.9778 | 1822 | 1.5392 | 0.1957 |
| 1.1608 | 82.0 | 1845 | 1.4875 | 0.2174 |
| 1.1308 | 82.9778 | 1867 | 1.5080 | 0.1957 |
| 1.1382 | 84.0 | 1890 | 1.4835 | 0.1739 |
| 1.1195 | 84.9778 | 1912 | 1.4076 | 0.1957 |
| 1.1149 | 86.0 | 1935 | 1.4840 | 0.1739 |
| 1.1344 | 86.9778 | 1957 | 1.4733 | 0.1957 |
| 1.1268 | 88.0 | 1980 | 1.4446 | 0.2391 |
| 1.1267 | 88.9778 | 2002 | 1.4360 | 0.2174 |
| 1.1034 | 90.0 | 2025 | 1.4329 | 0.1522 |
| 1.1113 | 90.9778 | 2047 | 1.4670 | 0.1739 |
| 1.0957 | 92.0 | 2070 | 1.4802 | 0.2391 |
| 1.1227 | 92.9778 | 2092 | 1.4715 | 0.1739 |
| 1.1083 | 94.0 | 2115 | 1.4813 | 0.1957 |
| 1.0583 | 94.9778 | 2137 | 1.5203 | 0.1957 |
| 1.093 | 96.0 | 2160 | 1.5394 | 0.1739 |
| 1.0809 | 96.9778 | 2182 | 1.4620 | 0.1739 |
| 1.0888 | 98.0 | 2205 | 1.4407 | 0.1739 |
| 1.1292 | 98.9778 | 2227 | 1.4578 | 0.1957 |
| 1.0754 | 100.0 | 2250 | 1.5031 | 0.1739 |
| 1.0817 | 100.9778 | 2272 | 1.4461 | 0.2174 |
| 1.0671 | 102.0 | 2295 | 1.4723 | 0.2391 |
| 1.0815 | 102.9778 | 2317 | 1.4989 | 0.1957 |
| 1.0967 | 104.0 | 2340 | 1.4654 | 0.2174 |
| 1.091 | 104.9778 | 2362 | 1.4559 | 0.2174 |
| 1.0895 | 106.0 | 2385 | 1.4221 | 0.2826 |
| 1.0847 | 106.9778 | 2407 | 1.4293 | 0.2826 |
| 1.102 | 108.0 | 2430 | 1.4582 | 0.2391 |
| 1.0404 | 108.9778 | 2452 | 1.4656 | 0.2174 |
| 1.0488 | 110.0 | 2475 | 1.4890 | 0.2174 |
| 1.0966 | 110.9778 | 2497 | 1.4632 | 0.2174 |
| 1.0901 | 112.0 | 2520 | 1.4495 | 0.2174 |
| 1.1008 | 112.9778 | 2542 | 1.4333 | 0.2174 |
| 1.0884 | 114.0 | 2565 | 1.4406 | 0.2174 |
| 1.0889 | 114.9778 | 2587 | 1.4474 | 0.2174 |
| 1.0729 | 116.0 | 2610 | 1.4561 | 0.2174 |
| 1.0671 | 116.9778 | 2632 | 1.4538 | 0.2174 |
| 1.0937 | 117.3333 | 2640 | 1.4532 | 0.2174 |

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3