---
license: apache-2.0
base_model: microsoft/swin-base-patch4-window7-224-in22k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-base-patch4-window7-224-in22k-finetuned-lora-ISIC-2019
    results: []
---

swin-base-patch4-window7-224-in22k-finetuned-lora-ISIC-2019

This model is a LoRA fine-tuned version of microsoft/swin-base-patch4-window7-224-in22k on an imagefolder dataset (ISIC 2019 skin-lesion images, per the model name). It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.4229
  • Accuracy: 0.9008
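
A minimal sketch of how the adapter could be loaded for inference, assuming it was trained with PEFT's LoRA on top of the base Swin checkpoint. The adapter repository id, the number of ISIC 2019 classes, and the image path are assumptions/placeholders, not values stated on this card:

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

base_id = "microsoft/swin-base-patch4-window7-224-in22k"
adapter_id = "your-username/your-lora-adapter"  # placeholder: point this at the actual adapter repo

processor = AutoImageProcessor.from_pretrained(base_id)

# The base checkpoint ships an ImageNet-22k classification head; the fine-tuned head covers the
# ISIC 2019 classes, so a fresh head of the right size is created before the adapter is loaded.
num_isic_classes = 8  # assumption: the 8 diagnostic categories of ISIC 2019
model = AutoModelForImageClassification.from_pretrained(
    base_id,
    num_labels=num_isic_classes,
    ignore_mismatched_sizes=True,
)
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

image = Image.open("lesion.jpg")  # placeholder path to a dermoscopic image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(-1).item())
```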

Model description

This is a LoRA adapter for the Swin Base transformer (patch size 4, window size 7, 224×224 input, pre-trained on ImageNet-22k), fine-tuned for skin-lesion image classification on ISIC 2019. Per the repository naming, the adapter was trained with rank 64, alpha 16 and dropout 0.05. No further architectural details are documented.

Intended uses & limitations

More information needed

Training and evaluation data

The card's metadata lists only the generic imagefolder loader; per the model name, the underlying data are ISIC 2019 skin-lesion images. Splits, class balance, and preprocessing are not documented; a loading sketch follows.
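
A sketch of how an imagefolder-style ISIC 2019 dataset could be loaded and preprocessed for this checkpoint. The directory layout, data_dir, and train/validation split below are assumptions, since none of them are documented on the card:

```python
from datasets import load_dataset
from transformers import AutoImageProcessor

# Assumption: images arranged as ISIC_2019/<class_name>/<image>.jpg,
# which is the layout the generic imagefolder builder expects.
dataset = load_dataset("imagefolder", data_dir="ISIC_2019")  # placeholder path
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)  # assumed split; not documented

processor = AutoImageProcessor.from_pretrained("microsoft/swin-base-patch4-window7-224-in22k")

def preprocess(batch):
    # Resize and normalize to the 224x224 input the Swin checkpoint expects.
    batch["pixel_values"] = [
        processor(img.convert("RGB"), return_tensors="pt")["pixel_values"][0]
        for img in batch["image"]
    ]
    return batch

splits = splits.with_transform(preprocess)
```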

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
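
A configuration sketch consistent with the hyperparameters above. The LoRA settings (r=64, lora_alpha=16, lora_dropout=0.05) come from the repository naming rather than from this card, and the target_modules, modules_to_save, and class count are assumptions about how the adapter was attached:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForImageClassification, TrainingArguments

model = AutoModelForImageClassification.from_pretrained(
    "microsoft/swin-base-patch4-window7-224-in22k",
    num_labels=8,                      # assumption: 8 ISIC 2019 diagnostic classes
    ignore_mismatched_sizes=True,      # swap the ImageNet-22k head for an ISIC-sized one
)

lora_config = LoraConfig(
    r=64,                              # rank, per the repository naming
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["query", "value"], # assumption: attention projections of the Swin blocks
    modules_to_save=["classifier"],    # keep the new classification head trainable
)
model = get_peft_model(model, lora_config)

# Hyperparameters from the list above; 64 per-device batch x 4 accumulation steps = 256 effective.
args = TrainingArguments(
    output_dir="swin-base-patch4-window7-224-in22k-finetuned-lora-ISIC-2019",
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",       # assumption: the results table reports one eval per epoch
    logging_strategy="epoch",
)
# model and args would then be passed to transformers.Trainer together with the train/eval splits.
```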

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.858  | 0.99  | 62   | 0.7349 | 0.7339 |
| 0.7403 | 2.0   | 125  | 0.6364 | 0.7762 |
| 0.675  | 2.99  | 187  | 0.5777 | 0.7999 |
| 0.6309 | 4.0   | 250  | 0.5701 | 0.7875 |
| 0.5734 | 4.99  | 312  | 0.5294 | 0.8016 |
| 0.5338 | 6.0   | 375  | 0.5418 | 0.8010 |
| 0.5104 | 6.99  | 437  | 0.5057 | 0.8179 |
| 0.5091 | 8.0   | 500  | 0.5010 | 0.8207 |
| 0.4678 | 8.99  | 562  | 0.4757 | 0.8247 |
| 0.467  | 10.0  | 625  | 0.4579 | 0.8151 |
| 0.4416 | 10.99 | 687  | 0.4650 | 0.8315 |
| 0.4277 | 12.0  | 750  | 0.4405 | 0.8405 |
| 0.4261 | 12.99 | 812  | 0.4414 | 0.8388 |
| 0.4016 | 14.0  | 875  | 0.4392 | 0.8286 |
| 0.3729 | 14.99 | 937  | 0.4471 | 0.8281 |
| 0.3813 | 16.0  | 1000 | 0.4155 | 0.8433 |
| 0.3454 | 16.99 | 1062 | 0.4322 | 0.8365 |
| 0.3639 | 18.0  | 1125 | 0.4332 | 0.8360 |
| 0.3393 | 18.99 | 1187 | 0.4190 | 0.8523 |
| 0.3135 | 20.0  | 1250 | 0.4166 | 0.8534 |
| 0.3094 | 20.99 | 1312 | 0.4005 | 0.8563 |
| 0.3263 | 22.0  | 1375 | 0.4399 | 0.8495 |
| 0.3009 | 22.99 | 1437 | 0.4122 | 0.8523 |
| 0.2804 | 24.0  | 1500 | 0.4293 | 0.8563 |
| 0.2516 | 24.99 | 1562 | 0.4289 | 0.8563 |
| 0.2763 | 26.0  | 1625 | 0.4125 | 0.8647 |
| 0.2707 | 26.99 | 1687 | 0.4231 | 0.8664 |
| 0.2585 | 28.0  | 1750 | 0.4210 | 0.8596 |
| 0.2317 | 28.99 | 1812 | 0.4296 | 0.8602 |
| 0.2118 | 30.0  | 1875 | 0.4440 | 0.8636 |
| 0.2224 | 30.99 | 1937 | 0.3928 | 0.8726 |
| 0.2166 | 32.0  | 2000 | 0.4246 | 0.8602 |
| 0.2038 | 32.99 | 2062 | 0.4146 | 0.8709 |
| 0.2183 | 34.0  | 2125 | 0.4165 | 0.8698 |
| 0.22   | 34.99 | 2187 | 0.4212 | 0.8766 |
| 0.206  | 36.0  | 2250 | 0.4139 | 0.8726 |
| 0.199  | 36.99 | 2312 | 0.3793 | 0.8833 |
| 0.1926 | 38.0  | 2375 | 0.4127 | 0.8839 |
| 0.1648 | 38.99 | 2437 | 0.4296 | 0.8822 |
| 0.1578 | 40.0  | 2500 | 0.4132 | 0.8833 |
| 0.181  | 40.99 | 2562 | 0.4217 | 0.8777 |
| 0.1735 | 42.0  | 2625 | 0.4186 | 0.8715 |
| 0.1603 | 42.99 | 2687 | 0.4117 | 0.8805 |
| 0.1516 | 44.0  | 2750 | 0.4250 | 0.8816 |
| 0.1733 | 44.99 | 2812 | 0.3914 | 0.8844 |
| 0.164  | 46.0  | 2875 | 0.4369 | 0.8828 |
| 0.1519 | 46.99 | 2937 | 0.4276 | 0.8771 |
| 0.1534 | 48.0  | 3000 | 0.4421 | 0.8822 |
| 0.158  | 48.99 | 3062 | 0.4240 | 0.8873 |
| 0.1531 | 50.0  | 3125 | 0.4250 | 0.8794 |
| 0.1286 | 50.99 | 3187 | 0.4228 | 0.8732 |
| 0.1396 | 52.0  | 3250 | 0.4317 | 0.8782 |
| 0.1436 | 52.99 | 3312 | 0.4361 | 0.8856 |
| 0.1411 | 54.0  | 3375 | 0.4402 | 0.8850 |
| 0.1312 | 54.99 | 3437 | 0.4327 | 0.8884 |
| 0.1359 | 56.0  | 3500 | 0.4144 | 0.8856 |
| 0.1361 | 56.99 | 3562 | 0.4181 | 0.8867 |
| 0.1272 | 58.0  | 3625 | 0.4204 | 0.8878 |
| 0.1222 | 58.99 | 3687 | 0.4137 | 0.8884 |
| 0.1272 | 60.0  | 3750 | 0.4317 | 0.8890 |
| 0.1132 | 60.99 | 3812 | 0.4351 | 0.8918 |
| 0.1239 | 62.0  | 3875 | 0.4348 | 0.8828 |
| 0.1188 | 62.99 | 3937 | 0.4258 | 0.8861 |
| 0.1203 | 64.0  | 4000 | 0.4318 | 0.8912 |
| 0.1204 | 64.99 | 4062 | 0.4055 | 0.8952 |
| 0.1053 | 66.0  | 4125 | 0.4222 | 0.8918 |
| 0.1187 | 66.99 | 4187 | 0.4248 | 0.8946 |
| 0.1129 | 68.0  | 4250 | 0.4302 | 0.8923 |
| 0.1117 | 68.99 | 4312 | 0.4149 | 0.8968 |
| 0.1194 | 70.0  | 4375 | 0.4160 | 0.8895 |
| 0.1003 | 70.99 | 4437 | 0.4256 | 0.8946 |
| 0.1088 | 72.0  | 4500 | 0.4356 | 0.8918 |
| 0.11   | 72.99 | 4562 | 0.4277 | 0.8935 |
| 0.1016 | 74.0  | 4625 | 0.4095 | 0.8952 |
| 0.0906 | 74.99 | 4687 | 0.4262 | 0.8935 |
| 0.0969 | 76.0  | 4750 | 0.4057 | 0.8940 |
| 0.111  | 76.99 | 4812 | 0.4099 | 0.8997 |
| 0.091  | 78.0  | 4875 | 0.4232 | 0.8963 |
| 0.1013 | 78.99 | 4937 | 0.4311 | 0.8884 |
| 0.119  | 80.0  | 5000 | 0.4302 | 0.8929 |
| 0.0877 | 80.99 | 5062 | 0.4369 | 0.8923 |
| 0.0926 | 82.0  | 5125 | 0.4353 | 0.8968 |
| 0.0969 | 82.99 | 5187 | 0.4336 | 0.8952 |
| 0.092  | 84.0  | 5250 | 0.4214 | 0.8935 |
| 0.0914 | 84.99 | 5312 | 0.4403 | 0.8890 |
| 0.0924 | 86.0  | 5375 | 0.4285 | 0.8929 |
| 0.0964 | 86.99 | 5437 | 0.4207 | 0.8968 |
| 0.0916 | 88.0  | 5500 | 0.4254 | 0.8946 |
| 0.0962 | 88.99 | 5562 | 0.4249 | 0.8980 |
| 0.0927 | 90.0  | 5625 | 0.4242 | 0.8935 |
| 0.0993 | 90.99 | 5687 | 0.4230 | 0.8985 |
| 0.0893 | 92.0  | 5750 | 0.4229 | 0.8980 |
| 0.0878 | 92.99 | 5812 | 0.4215 | 0.8985 |
| 0.0882 | 94.0  | 5875 | 0.4262 | 0.8980 |
| 0.0854 | 94.99 | 5937 | 0.4256 | 0.8974 |
| 0.0795 | 96.0  | 6000 | 0.4229 | 0.9008 |
| 0.0931 | 96.99 | 6062 | 0.4218 | 0.8991 |
| 0.0826 | 98.0  | 6125 | 0.4235 | 0.8985 |
| 0.0926 | 98.99 | 6187 | 0.4237 | 0.8985 |
| 0.0829 | 99.2  | 6200 | 0.4238 | 0.8985 |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.0.1
  • Datasets 2.12.0
  • Tokenizers 0.13.2