---
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: swinv2-tiny-patch4-window8-256-finetuned-galaxy10-decals
  results: []
---

# swinv2-tiny-patch4-window8-256-finetuned-galaxy10-decals

This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the matthieulel/galaxy10_decals dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4357
- Accuracy: 0.8585

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.318         | 0.9940  | 124  | 1.0409          | 0.6359   |
| 0.9268        | 1.9960  | 249  | 0.7164          | 0.7497   |
| 0.8221        | 2.9980  | 374  | 0.6210          | 0.7875   |
| 0.7276        | 4.0     | 499  | 0.5564          | 0.8162   |
| 0.6425        | 4.9940  | 623  | 0.5226          | 0.8162   |
| 0.6518        | 5.9960  | 748  | 0.5377          | 0.8185   |
| 0.6096        | 6.9980  | 873  | 0.5341          | 0.8219   |
| 0.6282        | 8.0     | 998  | 0.4718          | 0.8399   |
| 0.5394        | 8.9940  | 1122 | 0.5113          | 0.8281   |
| 0.5718        | 9.9960  | 1247 | 0.5019          | 0.8292   |
| 0.5507        | 10.9980 | 1372 | 0.4545          | 0.8461   |
| 0.4921        | 12.0    | 1497 | 0.4613          | 0.8416   |
| 0.5571        | 12.9940 | 1621 | 0.4587          | 0.8416   |
| 0.512         | 13.9960 | 1746 | 0.4673          | 0.8512   |
| 0.4855        | 14.9980 | 1871 | 0.4641          | 0.8489   |
| 0.4895        | 16.0    | 1996 | 0.4556          | 0.8450   |
| 0.4809        | 16.9940 | 2120 | 0.4317          | 0.8523   |
| 0.4785        | 17.9960 | 2245 | 0.4338          | 0.8534   |
| 0.444         | 18.9980 | 2370 | 0.4357          | 0.8579   |
| 0.4255        | 19.8798 | 2480 | 0.4357          | 0.8585   |

### Framework versions

- Transformers 4.40.1
- Pytorch 1.12.1+cu116
- Datasets 2.19.0
- Tokenizers 0.19.1
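
For readers who want to approximate this setup, the hyperparameters listed above map roughly onto `TrainingArguments` as sketched below. This is an illustrative reconstruction, not the original training script: dataset loading, the image processor, and the `Trainer` call are omitted, `output_dir` is a placeholder, and the per-epoch evaluation/save strategy is assumed from the results table.

```python
from transformers import TrainingArguments

# Rough sketch of the hyperparameters listed in this card (not the original script).
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-finetuned-galaxy10-decals",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 gives the total train batch size of 128 on one device
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # assumed from the per-epoch results above
    save_strategy="epoch",           # assumed
)
```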
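
The checkpoint can be used for single-image galaxy morphology classification through the standard Transformers image-classification API. The snippet below is a minimal usage sketch; the model identifier and image path are placeholders to replace with the actual Hub repository name (or local checkpoint directory) and an input file.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "swinv2-tiny-patch4-window8-256-finetuned-galaxy10-decals"  # placeholder repo/path

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Load an RGB galaxy cutout and preprocess it to the model's expected input.
image = Image.open("galaxy_cutout.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```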