
swin-tiny-patch4-window7-224-ve-U13-b-80

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9190
  • Accuracy: 0.8043
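
The snippet below is a minimal inference sketch for loading this checkpoint with the standard transformers image-classification pipeline; the image path is a placeholder, not a file shipped with this card.

```python
# Minimal inference sketch using the transformers pipeline API.
# "path/to/image.jpg" is a placeholder; replace it with your own image.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Augusto777/swin-tiny-patch4-window7-224-ve-U13-b-80",
)

predictions = classifier("path/to/image.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts, best first
```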

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 80
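
As a sketch, these settings can be expressed with the transformers `TrainingArguments` as shown below. The `output_dir` and the per-epoch evaluation/save strategy are assumptions not stated in this card; the Adam betas and epsilon listed above are the Trainer defaults and so are not set explicitly.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir and the evaluation/save strategy are assumptions, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-ve-U13-b-80",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    num_train_epochs=80,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # assumption: evaluate once per epoch
    save_strategy="epoch",           # assumption: keep checkpoints per epoch
    load_best_model_at_end=True,     # assumption: report the best checkpoint
    metric_for_best_model="accuracy",
)
```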

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.92  | 6    | 1.3859          | 0.1304   |
| 1.3859        | 2.0   | 13   | 1.3828          | 0.2826   |
| 1.3859        | 2.92  | 19   | 1.3769          | 0.3261   |
| 1.379         | 4.0   | 26   | 1.3566          | 0.2826   |
| 1.3356        | 4.92  | 32   | 1.3162          | 0.2391   |
| 1.3356        | 6.0   | 39   | 1.2093          | 0.3478   |
| 1.2023        | 6.92  | 45   | 1.1349          | 0.4565   |
| 1.0274        | 8.0   | 52   | 1.0414          | 0.4783   |
| 1.0274        | 8.92  | 58   | 0.9788          | 0.5217   |
| 0.9125        | 10.0  | 65   | 1.0071          | 0.4348   |
| 0.7688        | 10.92 | 71   | 1.0416          | 0.5217   |
| 0.7688        | 12.0  | 78   | 1.0480          | 0.4130   |
| 0.6891        | 12.92 | 84   | 0.9351          | 0.5870   |
| 0.5795        | 14.0  | 91   | 1.0683          | 0.6304   |
| 0.5795        | 14.92 | 97   | 1.0698          | 0.6087   |
| 0.5337        | 16.0  | 104  | 0.9603          | 0.6304   |
| 0.4337        | 16.92 | 110  | 0.7188          | 0.6957   |
| 0.4337        | 18.0  | 117  | 0.7620          | 0.6739   |
| 0.4258        | 18.92 | 123  | 0.9433          | 0.6739   |
| 0.4045        | 20.0  | 130  | 1.0823          | 0.6522   |
| 0.4045        | 20.92 | 136  | 0.7059          | 0.7174   |
| 0.4135        | 22.0  | 143  | 0.7467          | 0.7391   |
| 0.4135        | 22.92 | 149  | 0.7637          | 0.7391   |
| 0.3525        | 24.0  | 156  | 0.8157          | 0.7391   |
| 0.263         | 24.92 | 162  | 0.9995          | 0.7174   |
| 0.263         | 26.0  | 169  | 0.8719          | 0.7609   |
| 0.272         | 26.92 | 175  | 0.9939          | 0.6957   |
| 0.262         | 28.0  | 182  | 0.8639          | 0.7174   |
| 0.262         | 28.92 | 188  | 1.0737          | 0.6522   |
| 0.2282        | 30.0  | 195  | 0.8416          | 0.7174   |
| 0.2098        | 30.92 | 201  | 0.9744          | 0.6739   |
| 0.2098        | 32.0  | 208  | 1.0593          | 0.6087   |
| 0.2141        | 32.92 | 214  | 1.0997          | 0.7174   |
| 0.1759        | 34.0  | 221  | 0.9735          | 0.5870   |
| 0.1759        | 34.92 | 227  | 1.0789          | 0.6957   |
| 0.2042        | 36.0  | 234  | 1.0664          | 0.6957   |
| 0.1591        | 36.92 | 240  | 0.9417          | 0.7609   |
| 0.1591        | 38.0  | 247  | 1.1042          | 0.6739   |
| 0.1579        | 38.92 | 253  | 0.9732          | 0.7609   |
| 0.1626        | 40.0  | 260  | 0.9960          | 0.6957   |
| 0.1626        | 40.92 | 266  | 0.9763          | 0.7391   |
| 0.1458        | 42.0  | 273  | 0.9790          | 0.7391   |
| 0.1458        | 42.92 | 279  | 1.0952          | 0.7174   |
| 0.1317        | 44.0  | 286  | 0.9190          | 0.8043   |
| 0.1255        | 44.92 | 292  | 0.9420          | 0.7391   |
| 0.1255        | 46.0  | 299  | 0.9085          | 0.7391   |
| 0.1352        | 46.92 | 305  | 0.9184          | 0.7174   |
| 0.1311        | 48.0  | 312  | 1.0567          | 0.7609   |
| 0.1311        | 48.92 | 318  | 1.1507          | 0.7174   |
| 0.1501        | 50.0  | 325  | 1.2068          | 0.7174   |
| 0.1088        | 50.92 | 331  | 1.4607          | 0.6957   |
| 0.1088        | 52.0  | 338  | 1.1036          | 0.6739   |
| 0.1152        | 52.92 | 344  | 1.1081          | 0.6957   |
| 0.1141        | 54.0  | 351  | 1.1006          | 0.6957   |
| 0.1141        | 54.92 | 357  | 1.1470          | 0.7174   |
| 0.1307        | 56.0  | 364  | 1.0715          | 0.7609   |
| 0.1273        | 56.92 | 370  | 1.1021          | 0.7174   |
| 0.1273        | 58.0  | 377  | 1.1176          | 0.6957   |
| 0.1066        | 58.92 | 383  | 1.0948          | 0.7174   |
| 0.1046        | 60.0  | 390  | 1.0563          | 0.7391   |
| 0.1046        | 60.92 | 396  | 1.1155          | 0.6957   |
| 0.1129        | 62.0  | 403  | 1.0922          | 0.6957   |
| 0.1129        | 62.92 | 409  | 1.0364          | 0.6957   |
| 0.1031        | 64.0  | 416  | 1.0675          | 0.7174   |
| 0.0808        | 64.92 | 422  | 1.1133          | 0.6957   |
| 0.0808        | 66.0  | 429  | 1.2029          | 0.7174   |
| 0.0783        | 66.92 | 435  | 1.1453          | 0.7174   |
| 0.09          | 68.0  | 442  | 1.0925          | 0.6957   |
| 0.09          | 68.92 | 448  | 1.0999          | 0.7174   |
| 0.0796        | 70.0  | 455  | 1.0971          | 0.7391   |
| 0.0828        | 70.92 | 461  | 1.0923          | 0.7391   |
| 0.0828        | 72.0  | 468  | 1.1061          | 0.7391   |
| 0.0923        | 72.92 | 474  | 1.1173          | 0.7391   |
| 0.092         | 73.85 | 480  | 1.1208          | 0.7391   |

Framework versions

  • Transformers 4.36.2
  • Pytorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0
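
The snippet below is a small convenience sketch for checking a local environment against the versions listed above; it is not part of the original training code.

```python
# Convenience sketch: compare installed library versions with the versions
# listed in this card. Not part of the original training code.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "Transformers": (transformers.__version__, "4.36.2"),
    "Pytorch": (torch.__version__, "2.1.2+cu118"),
    "Datasets": (datasets.__version__, "2.16.1"),
    "Tokenizers": (tokenizers.__version__, "0.15.0"),
}

for name, (installed, listed) in expected.items():
    status = "OK" if installed == listed else "differs"
    print(f"{name}: installed {installed}, card lists {listed} ({status})")
```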