---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
base_model: facebook/deit-base-patch16-224
datasets:
- medmnist-v2
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: organsmnist-vit-base-finetuned
  results: []
---

# organsmnist-vit-base-finetuned

This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on the OrganSMNIST subset of the MedMNIST v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2964
- Accuracy: 0.8993
- Precision: 0.8443
- Recall: 0.8396
- F1: 0.8394

## Model description

The model is a DeiT-base vision transformer adapted with a PEFT adapter for medical image classification. OrganSMNIST consists of sagittal-view abdominal CT images labeled with 11 body-organ classes. An inference sketch is given under "How to use" below.

## Intended uses & limitations

Intended for research and educational use on MedMNIST-style organ classification. The model has not been validated for clinical use.

## Training and evaluation data

The model was trained on the OrganSMNIST training split of MedMNIST v2; the results above were computed on the validation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch matching these values appears under "Reproducing the training configuration" below):
- learning_rate: 0.005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.9084        | 1.0   | 218  | 0.7151          | 0.7288   | 0.6998    | 0.6620 | 0.6412 |
| 0.89          | 2.0   | 436  | 0.3658          | 0.8540   | 0.7873    | 0.7898 | 0.7660 |
| 0.7851        | 3.0   | 654  | 0.3514          | 0.8438   | 0.8110    | 0.7674 | 0.7741 |
| 0.7144        | 4.0   | 872  | 0.3632          | 0.8670   | 0.8415    | 0.8133 | 0.7980 |
| 0.7383        | 5.0   | 1090 | 0.3680          | 0.8581   | 0.7769    | 0.8029 | 0.7786 |
| 0.6065        | 6.0   | 1308 | 0.2824          | 0.8870   | 0.8481    | 0.8328 | 0.8305 |
| 0.521         | 7.0   | 1526 | 0.2769          | 0.8940   | 0.8439    | 0.8404 | 0.8297 |
| 0.5305        | 8.0   | 1744 | 0.2611          | 0.9001   | 0.8517    | 0.8463 | 0.8447 |
| 0.4522        | 9.0   | 1962 | 0.2742          | 0.9058   | 0.8594    | 0.8517 | 0.8411 |
| 0.4445        | 10.0  | 2180 | 0.2964          | 0.8993   | 0.8443    | 0.8396 | 0.8394 |

### Framework versions

- PEFT 0.11.1
- Transformers 4.39.3
- PyTorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2
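
### Reproducing the training configuration

The hyperparameters listed above map directly onto Hugging Face `TrainingArguments`. The sketch below is an assumption about how the run was configured, not the actual training script; only the numeric values come from this card, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# A minimal sketch mapping the card's listed hyperparameters onto
# TrainingArguments. output_dir is an assumed placeholder.
training_args = TrainingArguments(
    output_dir="organsmnist-vit-base-finetuned",  # assumed output path
    learning_rate=5e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 16 x 4 = 64 total train batch size
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                      # Native AMP mixed-precision training
)
```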
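
### How to use

The adapter can be loaded on top of the base DeiT checkpoint with `peft`. This is a minimal inference sketch, assuming the classification head was resized to OrganSMNIST's 11 classes before adapter training; the adapter repo id and the input image path are placeholders.

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Load the base DeiT checkpoint with an 11-class head (assumed setup),
# then attach the fine-tuned PEFT adapter weights on top of it.
base = AutoModelForImageClassification.from_pretrained(
    "facebook/deit-base-patch16-224",
    num_labels=11,                 # OrganSMNIST has 11 organ classes
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)
model = PeftModel.from_pretrained(base, "organsmnist-vit-base-finetuned")
model.eval()

processor = AutoImageProcessor.from_pretrained("facebook/deit-base-patch16-224")

image = Image.open("scan.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(-1).item())  # predicted class index
```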