---
library_name: transformers
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
model-index:
- name: Ziboiai-large-2024_10_31-prova_batch-size32_freeze_probs
  results: []
---
# Ziboiai-large-2024_10_31-prova_batch-size32_freeze_probs
This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6195
- RMSE: 0.3419
- MAE: 0.3068
- R2: -1.6131
- Explained Variance: 0.2071
- Learning Rate: 1e-05
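
The metrics above (RMSE, MAE, R2, explained variance) indicate a regression-style task on images. Below is a minimal loading-and-inference sketch, assuming the checkpoint exposes a standard `transformers` image model head; the repo id and image path are placeholders.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id; point this at the actual checkpoint location.
checkpoint = "Ziboiai-large-2024_10_31-prova_batch-size32_freeze_probs"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# For a regression-style head, the logits hold the predicted value(s).
print(outputs.logits)
```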
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
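
For reference, a minimal `TrainingArguments` sketch matching the hyperparameters above; the `output_dir` and evaluation strategy are assumptions, and the listed Adam betas/epsilon are the `Trainer` defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Ziboiai-large-2024_10_31-prova_batch-size32_freeze_probs",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    fp16=True,               # Native AMP mixed-precision training
    eval_strategy="epoch",   # assumption: the results table logs one evaluation per epoch
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults, so no override is needed.
)
```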
### Training results
Training Loss | Epoch | Step | Validation Loss | RMSE | MAE | R2 | Explained Variance | Learning Rate |
---|---|---|---|---|---|---|---|---|
No log | 1.0 | 2 | 0.7150 | 0.4100 | 0.3849 | -20.2909 | 0.0364 | 0.001 |
No log | 2.0 | 4 | 0.7314 | 0.4163 | 0.3895 | -21.2182 | 0.0241 | 0.001 |
No log | 3.0 | 6 | 0.7726 | 0.4321 | 0.4041 | -24.8224 | -0.0469 | 0.001 |
No log | 4.0 | 8 | 0.7917 | 0.4380 | 0.4095 | -26.5816 | -0.0667 | 0.001 |
No log | 5.0 | 10 | 0.7853 | 0.4318 | 0.4021 | -26.9559 | -0.1362 | 0.001 |
No log | 6.0 | 12 | 0.7648 | 0.4224 | 0.3905 | -24.4015 | -0.1297 | 0.001 |
No log | 7.0 | 14 | 0.7392 | 0.4103 | 0.3760 | -22.5579 | -0.1098 | 0.001 |
No log | 8.0 | 16 | 0.7115 | 0.3983 | 0.3639 | -20.0674 | -0.1054 | 0.0001 |
No log | 9.0 | 18 | 0.6897 | 0.3879 | 0.3535 | -18.1665 | -0.0925 | 0.0001 |
No log | 10.0 | 20 | 0.6777 | 0.3818 | 0.3468 | -16.9447 | -0.1029 | 0.0001 |
No log | 11.0 | 22 | 0.6702 | 0.3780 | 0.3424 | -16.0375 | -0.1169 | 0.0001 |
No log | 12.0 | 24 | 0.6639 | 0.3744 | 0.3389 | -15.6052 | -0.1121 | 0.0001 |
No log | 13.0 | 26 | 0.6565 | 0.3703 | 0.3346 | -14.8051 | -0.1065 | 0.0001 |
No log | 14.0 | 28 | 0.6501 | 0.3668 | 0.3310 | -14.2312 | -0.0958 | 0.0001 |
No log | 15.0 | 30 | 0.6468 | 0.3648 | 0.3289 | -14.0799 | -0.0855 | 0.0001 |
No log | 16.0 | 32 | 0.6471 | 0.3650 | 0.3289 | -14.2557 | -0.0823 | 0.0001 |
No log | 17.0 | 34 | 0.6435 | 0.3631 | 0.3268 | -14.0598 | -0.0810 | 0.0001 |
No log | 18.0 | 36 | 0.6438 | 0.3634 | 0.3270 | -14.0369 | -0.0799 | 0.0001 |
No log | 19.0 | 38 | 0.6400 | 0.3614 | 0.3250 | -13.8152 | -0.0888 | 0.0001 |
No log | 20.0 | 40 | 0.6392 | 0.3609 | 0.3246 | -13.7104 | -0.0935 | 0.0001 |
No log | 21.0 | 42 | 0.6387 | 0.3606 | 0.3246 | -13.8099 | -0.0993 | 0.0001 |
No log | 22.0 | 44 | 0.6388 | 0.3606 | 0.3243 | -13.8497 | -0.1056 | 0.0001 |
No log | 23.0 | 46 | 0.6362 | 0.3590 | 0.3228 | -13.5622 | -0.1035 | 0.0001 |
No log | 24.0 | 48 | 0.6354 | 0.3585 | 0.3223 | -13.6453 | -0.1058 | 0.0001 |
No log | 25.0 | 50 | 0.6345 | 0.3578 | 0.3214 | -13.6023 | -0.1036 | 0.0001 |
No log | 26.0 | 52 | 0.6349 | 0.3581 | 0.3212 | -13.6304 | -0.1173 | 0.0001 |
No log | 27.0 | 54 | 0.6333 | 0.3571 | 0.3201 | -13.5613 | -0.1148 | 0.0001 |
No log | 28.0 | 56 | 0.6295 | 0.3548 | 0.3177 | -13.2331 | -0.1083 | 0.0001 |
No log | 29.0 | 58 | 0.6285 | 0.3543 | 0.3173 | -13.1623 | -0.1047 | 0.0001 |
No log | 30.0 | 60 | 0.6263 | 0.3532 | 0.3163 | -12.7132 | -0.0926 | 0.0001 |
No log | 31.0 | 62 | 0.6273 | 0.3538 | 0.3167 | -12.8739 | -0.0893 | 0.0001 |
No log | 32.0 | 64 | 0.6294 | 0.3550 | 0.3181 | -12.9355 | -0.0790 | 0.0001 |
No log | 33.0 | 66 | 0.6299 | 0.3554 | 0.3185 | -12.9352 | -0.0752 | 0.0001 |
No log | 34.0 | 68 | 0.6321 | 0.3564 | 0.3193 | -13.2672 | -0.0702 | 0.0001 |
No log | 35.0 | 70 | 0.6279 | 0.3541 | 0.3175 | -12.9995 | -0.0487 | 0.0001 |
No log | 36.0 | 72 | 0.6280 | 0.3541 | 0.3174 | -13.0074 | -0.0466 | 0.0001 |
No log | 37.0 | 74 | 0.6304 | 0.3554 | 0.3187 | -13.2310 | -0.0494 | 1e-05 |
No log | 38.0 | 76 | 0.6297 | 0.3551 | 0.3183 | -12.9830 | -0.0439 | 1e-05 |
No log | 39.0 | 78 | 0.6308 | 0.3558 | 0.3193 | -13.1598 | -0.0430 | 1e-05 |
No log | 40.0 | 80 | 0.6292 | 0.3548 | 0.3183 | -13.0698 | -0.0435 | 1e-05 |
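
The regression metrics reported above can be computed from the `Trainer` predictions with scikit-learn; the `compute_metrics` sketch below is one plausible implementation, not necessarily the one used for this card.

```python
import numpy as np
from sklearn.metrics import (
    explained_variance_score,
    mean_absolute_error,
    mean_squared_error,
    r2_score,
)

def compute_metrics(eval_pred):
    """Compute the regression metrics reported in the tables above."""
    predictions, labels = eval_pred
    predictions = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()
    return {
        "rmse": float(np.sqrt(mean_squared_error(labels, predictions))),
        "mae": float(mean_absolute_error(labels, predictions)),
        "r2": float(r2_score(labels, predictions)),
        "explained_variance": float(explained_variance_score(labels, predictions)),
    }
```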
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1