# square_run_first_vote_full_pic_75
This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.7586
- F1 Macro: 0.4234
- F1 Micro: 0.5152
- F1 Weighted: 0.4789
- Precision Macro: 0.4562
- Precision Micro: 0.5152
- Precision Weighted: 0.5061
- Recall Macro: 0.4488
- Recall Micro: 0.5152
- Recall Weighted: 0.5152
- Accuracy: 0.5152
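
The macro, micro, and weighted scores above are standard multi-class aggregations of per-class F1, precision, and recall. The exact evaluation code is not part of this card; the following is a minimal sketch, assuming the metrics were computed with scikit-learn-style averaging over predicted and true class labels:

```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(y_true, y_pred):
    """Aggregate multi-class metrics in the same way they are reported above."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "f1_micro": f1_score(y_true, y_pred, average="micro"),
        "f1_weighted": f1_score(y_true, y_pred, average="weighted"),
        "precision_macro": precision_score(y_true, y_pred, average="macro", zero_division=0),
        "precision_micro": precision_score(y_true, y_pred, average="micro", zero_division=0),
        "precision_weighted": precision_score(y_true, y_pred, average="weighted", zero_division=0),
        "recall_macro": recall_score(y_true, y_pred, average="macro"),
        "recall_micro": recall_score(y_true, y_pred, average="micro"),
        "recall_weighted": recall_score(y_true, y_pred, average="weighted"),
    }
```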
## Model description
More information needed
## Intended uses & limitations
More information needed
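
No usage example is provided by the author. The sketch below shows one way to load the checkpoint as a standard ViT image classifier with the `transformers` library, assuming the model is published under `corranm/square_run_first_vote_full_pic_75` (the repository named in this card) and that its class labels are stored in the model config; the input image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "corranm/square_run_first_vote_full_pic_75"  # repository named in this card
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```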
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (8-bit, bitsandbytes; `OptimizerNames.ADAMW_BNB`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
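
The original training script is not included in this card. A minimal `TrainingArguments` sketch that mirrors the hyperparameters listed above (the output directory name is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="square_run_first_vote_full_pic_75",  # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_bnb_8bit",  # OptimizerNames.ADAMW_BNB
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```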
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.9675 | 1.0 | 58 | 1.9322 | 0.1051 | 0.1894 | 0.1204 | 0.1047 | 0.1894 | 0.1191 | 0.1661 | 0.1894 | 0.1894 | 0.1894 |
1.8921 | 2.0 | 116 | 1.9534 | 0.0786 | 0.1818 | 0.0855 | 0.0551 | 0.1818 | 0.0597 | 0.1656 | 0.1818 | 0.1818 | 0.1818 |
1.9081 | 3.0 | 174 | 1.8370 | 0.1526 | 0.2803 | 0.1976 | 0.1283 | 0.2803 | 0.1642 | 0.2117 | 0.2803 | 0.2803 | 0.2803 |
1.5193 | 4.0 | 232 | 1.7240 | 0.1963 | 0.3258 | 0.2445 | 0.2948 | 0.3258 | 0.3289 | 0.2476 | 0.3258 | 0.3258 | 0.3258 |
1.7743 | 5.0 | 290 | 1.5478 | 0.3382 | 0.4318 | 0.3920 | 0.3494 | 0.4318 | 0.4204 | 0.3837 | 0.4318 | 0.4318 | 0.4318 |
1.9879 | 6.0 | 348 | 1.5070 | 0.3157 | 0.4470 | 0.3865 | 0.4597 | 0.4470 | 0.5200 | 0.3499 | 0.4470 | 0.4470 | 0.4470 |
1.9096 | 7.0 | 406 | 1.4281 | 0.3859 | 0.4545 | 0.4410 | 0.4248 | 0.4545 | 0.4763 | 0.3931 | 0.4545 | 0.4545 | 0.4545 |
1.4577 | 8.0 | 464 | 1.4558 | 0.3862 | 0.4773 | 0.4381 | 0.3827 | 0.4773 | 0.4425 | 0.4346 | 0.4773 | 0.4773 | 0.4773 |
1.9664 | 9.0 | 522 | 1.5863 | 0.3757 | 0.4773 | 0.4227 | 0.3967 | 0.4773 | 0.4530 | 0.4288 | 0.4773 | 0.4773 | 0.4773 |
0.7655 | 10.0 | 580 | 1.3785 | 0.4015 | 0.5 | 0.4621 | 0.5175 | 0.5 | 0.5866 | 0.4427 | 0.5 | 0.5 | 0.5 |
0.707 | 11.0 | 638 | 1.3441 | 0.4772 | 0.5530 | 0.5356 | 0.4915 | 0.5530 | 0.5453 | 0.4861 | 0.5530 | 0.5530 | 0.5530 |
0.782 | 12.0 | 696 | 1.3983 | 0.4716 | 0.5530 | 0.5325 | 0.4860 | 0.5530 | 0.5432 | 0.4877 | 0.5530 | 0.5530 | 0.5530 |
0.7316 | 13.0 | 754 | 1.6155 | 0.4880 | 0.5530 | 0.5497 | 0.5085 | 0.5530 | 0.5892 | 0.5080 | 0.5530 | 0.5530 | 0.5530 |
1.0819 | 14.0 | 812 | 1.4869 | 0.4936 | 0.5379 | 0.5312 | 0.5124 | 0.5379 | 0.5370 | 0.4900 | 0.5379 | 0.5379 | 0.5379 |
0.8757 | 15.0 | 870 | 1.6936 | 0.4741 | 0.5303 | 0.5300 | 0.4809 | 0.5303 | 0.5481 | 0.4847 | 0.5303 | 0.5303 | 0.5303 |
0.7228 | 16.0 | 928 | 1.7370 | 0.4442 | 0.5227 | 0.4986 | 0.4401 | 0.5227 | 0.4939 | 0.4646 | 0.5227 | 0.5227 | 0.5227 |
0.3016 | 17.0 | 986 | 1.6977 | 0.5279 | 0.5682 | 0.5642 | 0.6353 | 0.5682 | 0.5994 | 0.5176 | 0.5682 | 0.5682 | 0.5682 |
0.2097 | 18.0 | 1044 | 1.9026 | 0.4769 | 0.5606 | 0.5414 | 0.5384 | 0.5606 | 0.5783 | 0.4819 | 0.5606 | 0.5606 | 0.5606 |
0.0388 | 19.0 | 1102 | 1.8276 | 0.5259 | 0.6136 | 0.5981 | 0.5252 | 0.6136 | 0.5945 | 0.5382 | 0.6136 | 0.6136 | 0.6136 |
0.4837 | 20.0 | 1160 | 1.8658 | 0.5336 | 0.5985 | 0.5863 | 0.5502 | 0.5985 | 0.5866 | 0.5342 | 0.5985 | 0.5985 | 0.5985 |
0.1531 | 21.0 | 1218 | 2.0415 | 0.4703 | 0.5606 | 0.5384 | 0.4917 | 0.5606 | 0.5489 | 0.4762 | 0.5606 | 0.5606 | 0.5606 |
0.0142 | 22.0 | 1276 | 2.0812 | 0.4969 | 0.5303 | 0.5260 | 0.5067 | 0.5303 | 0.5364 | 0.5008 | 0.5303 | 0.5303 | 0.5303 |
0.0036 | 23.0 | 1334 | 2.0662 | 0.5315 | 0.5758 | 0.5781 | 0.5480 | 0.5758 | 0.5925 | 0.5316 | 0.5758 | 0.5758 | 0.5758 |
0.0065 | 24.0 | 1392 | 2.1023 | 0.5090 | 0.5606 | 0.5516 | 0.5140 | 0.5606 | 0.5550 | 0.5154 | 0.5606 | 0.5606 | 0.5606 |
0.1359 | 25.0 | 1450 | 2.0555 | 0.4994 | 0.5455 | 0.5440 | 0.5018 | 0.5455 | 0.5474 | 0.5021 | 0.5455 | 0.5455 | 0.5455 |
0.0037 | 26.0 | 1508 | 2.1745 | 0.5206 | 0.5758 | 0.5691 | 0.5289 | 0.5758 | 0.5695 | 0.5204 | 0.5758 | 0.5758 | 0.5758 |
0.0391 | 27.0 | 1566 | 2.2087 | 0.5204 | 0.5758 | 0.5676 | 0.5335 | 0.5758 | 0.5745 | 0.5228 | 0.5758 | 0.5758 | 0.5758 |
0.0017 | 28.0 | 1624 | 2.1219 | 0.5178 | 0.5682 | 0.5633 | 0.5218 | 0.5682 | 0.5649 | 0.5212 | 0.5682 | 0.5682 | 0.5682 |
0.0015 | 29.0 | 1682 | 2.1455 | 0.5198 | 0.5682 | 0.5618 | 0.5342 | 0.5682 | 0.5641 | 0.5190 | 0.5682 | 0.5682 | 0.5682 |
0.0015 | 30.0 | 1740 | 2.1308 | 0.5192 | 0.5682 | 0.5617 | 0.5315 | 0.5682 | 0.5621 | 0.5190 | 0.5682 | 0.5682 | 0.5682 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0