dwiedarioo/vit-base-patch16-224-in21k-datascience8
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the training and validation sets at the final epoch:
- Train Loss: 0.0061
- Train Accuracy: 1.0
- Train Top-3-accuracy: 1.0
- Validation Loss: 0.1289
- Validation Accuracy: 0.9633
- Validation Top-3-accuracy: 0.9935
- Epoch: 53
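The card does not include a usage example; below is a minimal inference sketch using the standard Transformers TF classes for ViT. The image path is a placeholder, and the predicted label name is read from the checkpoint's config.

```python
# Minimal inference sketch; assumes the checkpoint is available on the Hub.
import tensorflow as tf
from PIL import Image
from transformers import TFViTForImageClassification, ViTImageProcessor

repo = "dwiedarioo/vit-base-patch16-224-in21k-datascience8"
processor = ViTImageProcessor.from_pretrained(repo)
model = TFViTForImageClassification.from_pretrained(repo)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred])
```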
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (transformers.optimization_tf), wrapped in a dynamic loss-scale optimizer (initial_scale: 32768.0, dynamic_growth_steps: 2000) for mixed-precision training
- learning_rate: PolynomialDecay from 3e-05 to 0.0 over 8200 steps (power: 1.0, no cycling)
- beta_1: 0.9, beta_2: 0.999 (serialized as float32: 0.8999999761581421, 0.9990000128746033)
- epsilon: 1e-08, amsgrad: False
- weight_decay_rate: 0.01
- training_precision: mixed_float16
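The serialized optimizer corresponds to Transformers' AdamWeightDecay under a linear (PolynomialDecay) schedule, wrapped by Keras in a dynamic loss-scale optimizer because of the mixed_float16 policy. A hedged sketch of how to recreate it with the create_optimizer helper, with all values taken from the config above:

```python
# Sketch reconstructing the optimizer from the serialized config above.
import tensorflow as tf
from transformers import create_optimizer

# mixed_float16 policy; Keras adds the dynamic loss-scale wrapper at compile time
tf.keras.mixed_precision.set_global_policy("mixed_float16")

optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,            # initial_learning_rate
    num_train_steps=8200,    # decay_steps: linear decay to end_learning_rate=0.0
    num_warmup_steps=0,      # the serialized schedule has no warmup
    adam_beta1=0.9,          # beta_1
    adam_beta2=0.999,        # beta_2
    adam_epsilon=1e-8,       # epsilon
    weight_decay_rate=0.01,  # weight_decay_rate
)
```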
Training results
| Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch |
|---|---|---|---|---|---|---|
2.2657 | 0.4219 | 0.6250 | 1.9041 | 0.5875 | 0.8121 | 0 |
1.5469 | 0.7006 | 0.8771 | 1.3444 | 0.7322 | 0.9136 | 1 |
1.0263 | 0.8519 | 0.9553 | 0.9408 | 0.8769 | 0.9719 | 2 |
0.6814 | 0.9412 | 0.9893 | 0.6752 | 0.9244 | 0.9827 | 3 |
0.4663 | 0.9779 | 0.9966 | 0.5106 | 0.9460 | 0.9935 | 4 |
0.3372 | 0.9927 | 0.9981 | 0.4127 | 0.9503 | 0.9892 | 5 |
0.2526 | 0.9958 | 0.9989 | 0.3468 | 0.9546 | 0.9914 | 6 |
0.2015 | 0.9973 | 1.0 | 0.3072 | 0.9568 | 0.9914 | 7 |
0.1663 | 0.9981 | 1.0 | 0.2609 | 0.9611 | 0.9935 | 8 |
0.1391 | 0.9989 | 0.9996 | 0.2353 | 0.9654 | 0.9957 | 9 |
0.1186 | 0.9992 | 1.0 | 0.2889 | 0.9438 | 0.9914 | 10 |
0.1201 | 0.9954 | 0.9996 | 0.3820 | 0.9006 | 0.9762 | 11 |
0.1402 | 0.9905 | 1.0 | 0.2185 | 0.9546 | 0.9892 | 12 |
0.0812 | 1.0 | 1.0 | 0.1898 | 0.9590 | 0.9914 | 13 |
0.0697 | 1.0 | 1.0 | 0.1757 | 0.9611 | 0.9935 | 14 |
0.0618 | 1.0 | 1.0 | 0.1698 | 0.9611 | 0.9914 | 15 |
0.0554 | 1.0 | 1.0 | 0.1625 | 0.9611 | 0.9935 | 16 |
0.0500 | 1.0 | 1.0 | 0.1592 | 0.9611 | 0.9935 | 17 |
0.0454 | 1.0 | 1.0 | 0.1526 | 0.9611 | 0.9935 | 18 |
0.0415 | 1.0 | 1.0 | 0.1494 | 0.9611 | 0.9935 | 19 |
0.0380 | 1.0 | 1.0 | 0.1473 | 0.9590 | 0.9935 | 20 |
0.0350 | 1.0 | 1.0 | 0.1443 | 0.9590 | 0.9935 | 21 |
0.0323 | 1.0 | 1.0 | 0.1403 | 0.9611 | 0.9935 | 22 |
0.0299 | 1.0 | 1.0 | 0.1408 | 0.9590 | 0.9935 | 23 |
0.0277 | 1.0 | 1.0 | 0.1368 | 0.9590 | 0.9935 | 24 |
0.0258 | 1.0 | 1.0 | 0.1369 | 0.9611 | 0.9935 | 25 |
0.0241 | 1.0 | 1.0 | 0.1361 | 0.9590 | 0.9935 | 26 |
0.0225 | 1.0 | 1.0 | 0.1355 | 0.9590 | 0.9935 | 27 |
0.0211 | 1.0 | 1.0 | 0.1349 | 0.9611 | 0.9935 | 28 |
0.0197 | 1.0 | 1.0 | 0.1312 | 0.9590 | 0.9935 | 29 |
0.0185 | 1.0 | 1.0 | 0.1317 | 0.9590 | 0.9935 | 30 |
0.0175 | 1.0 | 1.0 | 0.1328 | 0.9611 | 0.9935 | 31 |
0.0165 | 1.0 | 1.0 | 0.1318 | 0.9611 | 0.9935 | 32 |
0.0155 | 1.0 | 1.0 | 0.1320 | 0.9611 | 0.9935 | 33 |
0.0147 | 1.0 | 1.0 | 0.1294 | 0.9611 | 0.9935 | 34 |
0.0139 | 1.0 | 1.0 | 0.1306 | 0.9611 | 0.9935 | 35 |
0.0132 | 1.0 | 1.0 | 0.1291 | 0.9611 | 0.9935 | 36 |
0.0125 | 1.0 | 1.0 | 0.1295 | 0.9611 | 0.9935 | 37 |
0.0119 | 1.0 | 1.0 | 0.1306 | 0.9611 | 0.9935 | 38 |
0.0113 | 1.0 | 1.0 | 0.1275 | 0.9633 | 0.9935 | 39 |
0.0107 | 1.0 | 1.0 | 0.1282 | 0.9633 | 0.9935 | 40 |
0.0102 | 1.0 | 1.0 | 0.1272 | 0.9633 | 0.9935 | 41 |
0.0097 | 1.0 | 1.0 | 0.1282 | 0.9633 | 0.9935 | 42 |
0.0093 | 1.0 | 1.0 | 0.1269 | 0.9633 | 0.9935 | 43 |
0.0089 | 1.0 | 1.0 | 0.1286 | 0.9633 | 0.9935 | 44 |
0.0085 | 1.0 | 1.0 | 0.1278 | 0.9633 | 0.9935 | 45 |
0.0081 | 1.0 | 1.0 | 0.1285 | 0.9633 | 0.9935 | 46 |
0.0078 | 1.0 | 1.0 | 0.1291 | 0.9633 | 0.9935 | 47 |
0.0074 | 1.0 | 1.0 | 0.1290 | 0.9633 | 0.9935 | 48 |
0.0071 | 1.0 | 1.0 | 0.1283 | 0.9633 | 0.9935 | 49 |
0.0068 | 1.0 | 1.0 | 0.1292 | 0.9633 | 0.9935 | 50 |
0.0066 | 1.0 | 1.0 | 0.1295 | 0.9633 | 0.9935 | 51 |
0.0063 | 1.0 | 1.0 | 0.1290 | 0.9633 | 0.9935 | 52 |
0.0061 | 1.0 | 1.0 | 0.1289 | 0.9633 | 0.9935 | 53 |
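The accuracy columns above match the names of standard Keras metrics. Below is a sketch of a compile step that would log these columns; this is an assumption, since the actual training script is not part of the card, and the label count is hypothetical.

```python
# Sketch of a compile step producing the metric columns above (assumed;
# the original training script is not published with this card).
import tensorflow as tf
from transformers import TFViTForImageClassification, create_optimizer

num_labels = 8  # hypothetical: the dataset and its label count are not documented

model = TFViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k", num_labels=num_labels
)
optimizer, _ = create_optimizer(init_lr=3e-5, num_train_steps=8200, num_warmup_steps=0)

model.compile(
    optimizer=optimizer,  # no loss argument: the model's internal loss is used
    metrics=[
        tf.keras.metrics.SparseCategoricalAccuracy(name="accuracy"),
        tf.keras.metrics.SparseTopKCategoricalAccuracy(k=3, name="top-3-accuracy"),
    ],
)
```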
Framework versions
- Transformers 4.35.2
- TensorFlow 2.14.0
- Tokenizers 0.15.0