# vit-base-tour-augmentation-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set:

- Loss: 2.4590
- Accuracy: 0.4290
- F1: 0.4272
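
A minimal inference sketch, assuming the checkpoint is published on the Hub under this name (the repo id below is a placeholder and needs the owner namespace) and that the task is single-label image classification:

```python
# Minimal inference sketch; the repo id and image path are placeholders.
import torch
from PIL import Image
from transformers import ViTFeatureExtractor, ViTForImageClassification

repo_id = "vit-base-tour-augmentation-v5"  # prepend the owner namespace on the Hub
feature_extractor = ViTFeatureExtractor.from_pretrained(repo_id)
model = ViTForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```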
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
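
A hedged sketch of equivalent `TrainingArguments`; the Adam betas and epsilon listed above are the library defaults and are therefore not set explicitly, and the 1000-step evaluation cadence is inferred from the results table below:

```python
# Hedged reconstruction of the training configuration from the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-tour-augmentation-v5",  # placeholder output directory
    learning_rate=4e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,                    # Native AMP mixed precision (requires a CUDA device)
    evaluation_strategy="steps",  # assumption: validation every 1000 steps, per the results table
    eval_steps=1000,
    save_steps=1000,
)
# These arguments would then be passed to transformers.Trainer together with the
# model, the train/eval datasets, and a compute_metrics function (not shown here).
```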
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 1.634         | 0.77  | 1000  | 2.4590          | 0.4290   | 0.4272 |
| 0.8261        | 1.53  | 2000  | 2.4812          | 0.4704   | 0.4488 |
| 0.3823        | 2.3   | 3000  | 2.6315          | 0.4668   | 0.4567 |
| 0.1652        | 3.07  | 4000  | 2.8592          | 0.4670   | 0.4526 |
| 0.0713        | 3.83  | 5000  | 3.0906          | 0.4431   | 0.4534 |
| 0.0354        | 4.6   | 6000  | 3.2511          | 0.4551   | 0.4496 |
| 0.0214        | 5.36  | 7000  | 3.3369          | 0.4737   | 0.4604 |
| 0.0129        | 6.13  | 8000  | 3.4611          | 0.4619   | 0.4625 |
| 0.0079        | 6.9   | 9000  | 3.5376          | 0.4625   | 0.4584 |
| 0.0058        | 7.66  | 10000 | 3.5842          | 0.4706   | 0.4614 |
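
For reference, a hedged sketch of a `compute_metrics` function that produces accuracy/F1 values in the format reported above, using the `evaluate` library (which is not listed under the framework versions, so this is only illustrative; the weighted F1 averaging is also an assumption):

```python
# Illustrative metric computation for Trainer; averaging choice is an assumption.
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    acc = accuracy_metric.compute(predictions=predictions, references=labels)
    f1 = f1_metric.compute(predictions=predictions, references=labels, average="weighted")
    return {"accuracy": acc["accuracy"], "f1": f1["f1"]}
```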
### Framework versions
- Transformers 4.22.2
- Pytorch 1.12.1+cu113
- Datasets 2.5.2
- Tokenizers 0.12.1