---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
model-index:
- name: drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs
  results: []
---

# drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4061
- RMSE: 0.2019
- MAE: 0.1446
- KL Divergence: 0.9802
- Explained Variance: 0.3860
- Learning Rate: 0.0000 (rounded; the schedule had decayed below 1e-05 by the final epochs)

## Model description

More information needed

## Intended uses & limitations

More information needed
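
Pending fuller documentation, a minimal inference sketch is given below. It assumes the checkpoint is published on the Hugging Face Hub under the repository name in the title and exposes a standard image-classification head; the repo id, the input file name, and the sigmoid readout are all assumptions, not documented behavior.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id taken from the model name above; adjust to the actual Hub path.
repo_id = "drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("drone_photo.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The RMSE/MAE/KL-divergence metrics suggest the model predicts per-class
# probabilities rather than a single label, so a sigmoid readout is assumed
# here instead of argmax.
probs = torch.sigmoid(logits).squeeze(0)
print(probs)
```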

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
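
As a hedged reconstruction, the settings above map onto `TrainingArguments` roughly as follows; argument names follow Transformers 4.41, and `output_dir` plus the per-epoch evaluation/save strategies are assumptions inferred from the results table, not documented values.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="drone-DinoVdeau-from-binary-large",  # assumed name
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
    # so no explicit optimizer arguments are needed.
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                 # "Native AMP" mixed precision
    eval_strategy="epoch",     # assumed from the per-epoch validation rows
    save_strategy="epoch",     # assumed
)
```

Two details in the results table sit oddly with these settings: training stops at epoch 62 of the configured 150, which suggests early stopping, and the learning rate falls in factor-of-ten steps (1e-03 → 1e-04 → 1e-05) rather than linearly. Both are reproduced here as reported.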

### Training results

| Training Loss | Epoch | Step  | Validation Loss | RMSE   | MAE    | KL Divergence | Explained Variance | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:-------------:|:------------------:|:-------------:|
| No log        | 1.0   | 438   | 0.4306          | 0.2210 | 0.1621 | 1.0069        | 0.2882             | 0.001  |
| 0.4808        | 2.0   | 876   | 0.4246          | 0.2179 | 0.1547 | 1.3119        | 0.3118             | 0.001  |
| 0.421         | 3.0   | 1314  | 0.4223          | 0.2158 | 0.1554 | 1.0982        | 0.3192             | 0.001  |
| 0.4151        | 4.0   | 1752  | 0.4191          | 0.2142 | 0.1552 | 1.0414        | 0.3351             | 0.001  |
| 0.4114        | 5.0   | 2190  | 0.4171          | 0.2123 | 0.1541 | 1.0698        | 0.3384             | 0.001  |
| 0.4089        | 6.0   | 2628  | 0.4209          | 0.2140 | 0.1520 | 1.1959        | 0.3311             | 0.001  |
| 0.4091        | 7.0   | 3066  | 0.4166          | 0.2126 | 0.1530 | 1.1709        | 0.3382             | 0.001  |
| 0.4071        | 8.0   | 3504  | 0.4195          | 0.2143 | 0.1556 | 0.9712        | 0.3346             | 0.001  |
| 0.4071        | 9.0   | 3942  | 0.4167          | 0.2121 | 0.1524 | 1.1432        | 0.3415             | 0.001  |
| 0.4062        | 10.0  | 4380  | 0.4186          | 0.2139 | 0.1535 | 0.9121        | 0.3420             | 0.001  |
| 0.4052        | 11.0  | 4818  | 0.4156          | 0.2114 | 0.1536 | 0.9950        | 0.3442             | 0.001  |
| 0.406         | 12.0  | 5256  | 0.4188          | 0.2139 | 0.1555 | 1.0106        | 0.3390             | 0.001  |
| 0.4058        | 13.0  | 5694  | 0.4163          | 0.2121 | 0.1553 | 1.1482        | 0.3425             | 0.001  |
| 0.4056        | 14.0  | 6132  | 0.4193          | 0.2138 | 0.1546 | 1.2111        | 0.3286             | 0.001  |
| 0.4033        | 15.0  | 6570  | 0.4162          | 0.2121 | 0.1542 | 1.2043        | 0.3402             | 0.001  |
| 0.4057        | 16.0  | 7008  | 0.4139          | 0.2102 | 0.1528 | 1.0828        | 0.3500             | 0.001  |
| 0.4057        | 17.0  | 7446  | 0.4171          | 0.2118 | 0.1564 | 1.0006        | 0.3430             | 0.001  |
| 0.405         | 18.0  | 7884  | 0.4146          | 0.2107 | 0.1507 | 1.0514        | 0.3499             | 0.001  |
| 0.4035        | 19.0  | 8322  | 0.4186          | 0.2114 | 0.1532 | 0.9575        | 0.3468             | 0.001  |
| 0.4031        | 20.0  | 8760  | 0.4143          | 0.2108 | 0.1513 | 1.1648        | 0.3487             | 0.001  |
| 0.4048        | 21.0  | 9198  | 0.4195          | 0.2123 | 0.1533 | 1.2950        | 0.3385             | 0.001  |
| 0.4055        | 22.0  | 9636  | 0.4340          | 0.2110 | 0.1524 | inf           | 0.3463             | 0.001  |
| 0.4022        | 23.0  | 10074 | 0.4327          | 0.2085 | 0.1517 | nan           | 0.3621             | 0.0001 |
| 0.3978        | 24.0  | 10512 | 0.4385          | 0.2092 | 0.1493 | nan           | 0.3583             | 0.0001 |
| 0.3978        | 25.0  | 10950 | 0.4272          | 0.2074 | 0.1490 | inf           | 0.3649             | 0.0001 |
| 0.3988        | 26.0  | 11388 | 0.4105          | 0.2075 | 0.1480 | 1.1903        | 0.3644             | 0.0001 |
| 0.3958        | 27.0  | 11826 | 0.4096          | 0.2067 | 0.1494 | 0.9915        | 0.3688             | 0.0001 |
| 0.3965        | 28.0  | 12264 | 0.4104          | 0.2075 | 0.1493 | 0.9669        | 0.3681             | 0.0001 |
| 0.396         | 29.0  | 12702 | 0.4097          | 0.2069 | 0.1469 | 1.0433        | 0.3696             | 0.0001 |
| 0.3936        | 30.0  | 13140 | 0.4094          | 0.2065 | 0.1490 | 0.9082        | 0.3731             | 0.0001 |
| 0.3944        | 31.0  | 13578 | 0.4091          | 0.2065 | 0.1470 | 1.0120        | 0.3705             | 0.0001 |
| 0.3941        | 32.0  | 14016 | 0.4084          | 0.2060 | 0.1483 | 0.9708        | 0.3742             | 0.0001 |
| 0.3941        | 33.0  | 14454 | 0.4082          | 0.2057 | 0.1474 | 0.9317        | 0.3755             | 0.0001 |
| 0.3933        | 34.0  | 14892 | 0.4085          | 0.2061 | 0.1481 | 0.9619        | 0.3747             | 0.0001 |
| 0.3926        | 35.0  | 15330 | 0.4073          | 0.2054 | 0.1466 | 1.0523        | 0.3758             | 0.0001 |
| 0.3936        | 36.0  | 15768 | 0.4074          | 0.2052 | 0.1460 | 1.0622        | 0.3771             | 0.0001 |
| 0.3935        | 37.0  | 16206 | 0.4066          | 0.2047 | 0.1456 | 1.0201        | 0.3802             | 0.0001 |
| 0.3927        | 38.0  | 16644 | 0.4064          | 0.2045 | 0.1459 | 1.0557        | 0.3800             | 0.0001 |
| 0.392         | 39.0  | 17082 | 0.4078          | 0.2056 | 0.1469 | 1.0055        | 0.3771             | 0.0001 |
| 0.3915        | 40.0  | 17520 | 0.4068          | 0.2049 | 0.1464 | 0.9849        | 0.3805             | 0.0001 |
| 0.3915        | 41.0  | 17958 | 0.4089          | 0.2063 | 0.1489 | 0.8999        | 0.3778             | 0.0001 |
| 0.3907        | 42.0  | 18396 | 0.4069          | 0.2049 | 0.1463 | 1.0617        | 0.3797             | 0.0001 |
| 0.3919        | 43.0  | 18834 | 0.4058          | 0.2041 | 0.1450 | 1.0520        | 0.3830             | 0.0001 |
| 0.3902        | 44.0  | 19272 | 0.4071          | 0.2050 | 0.1475 | 1.0054        | 0.3809             | 0.0001 |
| 0.3896        | 45.0  | 19710 | 0.4067          | 0.2047 | 0.1440 | 1.1386        | 0.3813             | 0.0001 |
| 0.3925        | 46.0  | 20148 | 0.4067          | 0.2047 | 0.1457 | 1.0253        | 0.3831             | 0.0001 |
| 0.3896        | 47.0  | 20586 | 0.4062          | 0.2043 | 0.1473 | 1.0430        | 0.3834             | 0.0001 |
| 0.3902        | 48.0  | 21024 | 0.4065          | 0.2048 | 0.1457 | 1.1041        | 0.3812             | 0.0001 |
| 0.3902        | 49.0  | 21462 | 0.4071          | 0.2052 | 0.1463 | 1.0702        | 0.3798             | 0.0001 |
| 0.3897        | 50.0  | 21900 | 0.4064          | 0.2042 | 0.1479 | 0.8917        | 0.3857             | 1e-05  |
| 0.3875        | 51.0  | 22338 | 0.4058          | 0.2041 | 0.1437 | 0.9960        | 0.3845             | 1e-05  |
| 0.3874        | 52.0  | 22776 | 0.4053          | 0.2037 | 0.1446 | 1.0567        | 0.3851             | 1e-05  |
| 0.3899        | 53.0  | 23214 | 0.4056          | 0.2039 | 0.1462 | 1.0205        | 0.3859             | 1e-05  |
| 0.3892        | 54.0  | 23652 | 0.4059          | 0.2041 | 0.1441 | 0.9905        | 0.3854             | 1e-05  |
| 0.3892        | 55.0  | 24090 | 0.4061          | 0.2041 | 0.1471 | 0.9379        | 0.3856             | 1e-05  |
| 0.3869        | 56.0  | 24528 | 0.4059          | 0.2041 | 0.1454 | 0.9696        | 0.3854             | 1e-05  |
| 0.3869        | 57.0  | 24966 | 0.4058          | 0.2041 | 0.1460 | 1.0591        | 0.3842             | 1e-05  |
| 0.3874        | 58.0  | 25404 | 0.4063          | 0.2043 | 0.1460 | 0.9276        | 0.3860             | 1e-05  |
| 0.3887        | 59.0  | 25842 | 0.4056          | 0.2038 | 0.1453 | 0.9794        | 0.3868             | 0.0000 |
| 0.3882        | 60.0  | 26280 | 0.4057          | 0.2040 | 0.1446 | 1.0349        | 0.3851             | 0.0000 |
| 0.389         | 61.0  | 26718 | 0.4058          | 0.2041 | 0.1449 | 0.9860        | 0.3857             | 0.0000 |
| 0.3882        | 62.0  | 27156 | 0.4054          | 0.2037 | 0.1446 | 0.9528        | 0.3865             | 0.0000 |
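
The validation metrics above could be computed with a `compute_metrics` callback along these lines. This is a sketch, not the authors' code: it assumes predictions arrive as raw logits over per-class probabilities and that KL divergence is averaged per sample, neither of which the card documents.

```python
import numpy as np
from scipy.stats import entropy
from sklearn.metrics import (
    explained_variance_score,
    mean_absolute_error,
    mean_squared_error,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = 1.0 / (1.0 + np.exp(-logits))  # sigmoid over raw logits (assumed)
    eps = 1e-12  # guards against the inf/nan KL values seen around epochs 22-25
    return {
        "rmse": float(np.sqrt(mean_squared_error(labels, preds))),
        "mae": float(mean_absolute_error(labels, preds)),
        # Per-sample KL(labels || preds), averaged; the card does not specify
        # which direction or normalization was used.
        "kl_divergence": float(np.mean(entropy(labels.T + eps, preds.T + eps))),
        "explained_variance": float(explained_variance_score(labels, preds)),
    }
```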


### Framework versions

- Transformers 4.41.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.19.1