---
language:
- eng
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-base
model-index:
- name: dinov2-base-2024_09_09-batch-size32_epochs150_freeze
  results: []
---

DinoVd'eau is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base). It achieves the following results on the test set:

- Loss: 0.1321
- F1 Micro: 0.8069
- F1 Macro: 0.7121
- Roc Auc: 0.8742
- Accuracy: 0.2869
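The card reports both micro- and macro-averaged F1 because they weight classes differently: micro-F1 pools true/false positives over all labels, while macro-F1 averages per-class scores, so rare classes count as much as common ones. A minimal pure-Python sketch of the two aggregations (illustrative, not the evaluation code used for this model):

```python
def f1_micro_macro(y_true, y_pred):
    """Micro- and macro-averaged F1 for binary multilabel matrices.

    y_true, y_pred: lists of equal-length 0/1 lists (samples x classes).
    """
    n_classes = len(y_true[0])
    per_class_f1 = []
    tp_all = fp_all = fn_all = 0
    for c in range(n_classes):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t[c] == 1 and p[c] == 1)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t[c] == 0 and p[c] == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t[c] == 1 and p[c] == 0)
        tp_all, fp_all, fn_all = tp_all + tp, fp_all + fp, fn_all + fn
        denom = 2 * tp + fp + fn
        per_class_f1.append(2 * tp / denom if denom else 0.0)
    micro = 2 * tp_all / (2 * tp_all + fp_all + fn_all)  # pooled counts
    macro = sum(per_class_f1) / n_classes                # unweighted mean
    return micro, macro
```

The gap between F1 Micro (0.8069) and F1 Macro (0.7121) above suggests the model does noticeably worse on the rarer classes.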
 
---

# Model description

DinoVd'eau is a model built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
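The card does not publish the exact head dimensions, so the hidden size and dropout rate below are assumptions; this is only a minimal PyTorch sketch of a head combining the layer types described above (linear, ReLU, batch normalization, dropout), not the trained model's exact configuration:

```python
import torch
import torch.nn as nn

def build_head(embed_dim: int, num_classes: int,
               hidden_dim: int = 512, dropout: float = 0.5) -> nn.Module:
    """Illustrative multilabel classification head; hidden_dim and
    dropout are assumed values, not taken from this model card."""
    return nn.Sequential(
        nn.Linear(embed_dim, hidden_dim),
        nn.BatchNorm1d(hidden_dim),
        nn.ReLU(),
        nn.Dropout(dropout),
        nn.Linear(hidden_dim, num_classes),  # one logit per label
    )

# dinov2-base produces 768-dim features; the data table below has 31 classes
head = build_head(embed_dim=768, num_classes=31)
logits = head(torch.randn(4, 768))
print(logits.shape)  # torch.Size([4, 31])
```

Because the encoder is frozen during training (see the hyperparameters below), only a head like this one receives gradient updates.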
 
---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.

---
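Since this is a multilabel classifier, each class is decided independently: logits are passed through a sigmoid and thresholded per class rather than through a softmax. A minimal sketch of that decision step (the 0.5 threshold is a common default, not a value stated by this card):

```python
import math

def multilabel_predict(logits, threshold=0.5):
    """Turn raw per-class logits into independent 0/1 label decisions."""
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]  # per-class sigmoid
    return [1 if p >= threshold else 0 for p in probs]

print(multilabel_predict([2.0, -1.0, 0.3]))  # [1, 0, 1]
```

Under this scheme the "Accuracy" reported above is much lower than the F1 scores because an image only counts as correct if every one of its labels is predicted exactly.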
 
# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | val | test | Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched | 1469 | 464 | 475 | 2408 |
| Acropore_digitised | 568 | 160 | 160 | 888 |
| Acropore_sub_massive | 150 | 50 | 43 | 243 |
| Acropore_tabular | 999 | 297 | 293 | 1589 |
| Algae_assembly | 2546 | 847 | 845 | 4238 |
| Algae_drawn_up | 367 | 126 | 127 | 620 |
| Algae_limestone | 1652 | 557 | 563 | 2772 |
| Algae_sodding | 3148 | 984 | 985 | 5117 |
| Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
| Bleached_coral | 219 | 71 | 70 | 360 |
| Blurred | 191 | 67 | 62 | 320 |
| Dead_coral | 1979 | 642 | 643 | 3264 |
| Fish | 2018 | 656 | 647 | 3321 |
| Homo_sapiens | 161 | 62 | 59 | 282 |
| Human_object | 157 | 58 | 55 | 270 |
| Living_coral | 406 | 154 | 141 | 701 |
| Millepore | 385 | 127 | 125 | 637 |
| No_acropore_encrusting | 441 | 130 | 154 | 725 |
| No_acropore_foliaceous | 204 | 36 | 46 | 286 |
| No_acropore_massive | 1031 | 336 | 338 | 1705 |
| No_acropore_solitary | 202 | 53 | 48 | 303 |
| No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
| Rock | 4489 | 1495 | 1473 | 7457 |
| Rubble | 3092 | 1030 | 1001 | 5123 |
| Sand | 5842 | 1939 | 1938 | 9719 |
| Sea_cucumber | 1408 | 439 | 447 | 2294 |
| Sea_urchins | 327 | 107 | 111 | 545 |
| Sponge | 269 | 96 | 105 | 470 |
| Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
| Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |
---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
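The ReduceLROnPlateau schedule listed above can be sketched in a few lines of plain Python: when the validation loss fails to improve for more than `patience` consecutive epochs, the learning rate is multiplied by `factor`. This is an illustrative simplification of the scheduler's logic, not the training code itself:

```python
class ReduceLROnPlateauSketch:
    """Simplified plateau scheduler: cut LR by `factor` after more than
    `patience` epochs without a new best validation loss."""

    def __init__(self, lr=0.001, patience=5, factor=0.1):
        self.lr, self.patience, self.factor = lr, patience, factor
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        if val_loss < self.best:          # improvement: reset the counter
            self.best = val_loss
            self.bad_epochs = 0
        else:                             # plateau epoch
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor    # reduce LR and start over
                self.bad_epochs = 0
        return self.lr
```

This pattern matches the learning-rate trajectory in the training results: the rate drops from 0.001 to 0.0001 at epoch 20, to 1e-05 at epoch 33, and again at epoch 56.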
## Data Augmentation

Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
## Training results

Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
1 | 0.16006726026535034 | 0.23284823284823286 | 0.7633800438966739 | 0.6250897780499145 | 0.001
2 | 0.150440976023674 | 0.24982674982674982 | 0.7780064686856808 | 0.646165211379598 | 0.001
3 | 0.14829224348068237 | 0.2564102564102564 | 0.7816936696175046 | 0.6644318154557648 | 0.001
4 | 0.14641565084457397 | 0.2553707553707554 | 0.7862639635912287 | 0.680888104485521 | 0.001
5 | 0.14226503670215607 | 0.2681912681912682 | 0.7891243298442687 | 0.6919100708566497 | 0.001
6 | 0.1439608633518219 | 0.26507276507276506 | 0.7901946045268521 | 0.6987715680115144 | 0.001
7 | 0.1425073742866516 | 0.2681912681912682 | 0.7937821236053655 | 0.6849790066180481 | 0.001
8 | 0.14294348657131195 | 0.2636867636867637 | 0.793083667950504 | 0.6880365824342907 | 0.001
9 | 0.14630228281021118 | 0.25571725571725573 | 0.7926595005517636 | 0.6884565577441364 | 0.001
10 | 0.13922064006328583 | 0.27442827442827444 | 0.8009224940284985 | 0.7049759390767861 | 0.001
11 | 0.14429208636283875 | 0.26992376992376993 | 0.785345272946444 | 0.6892328865834217 | 0.001
12 | 0.14520499110221863 | 0.2713097713097713 | 0.7888341543513957 | 0.6976448599197044 | 0.001
13 | 0.13695523142814636 | 0.2765072765072765 | 0.8007200870802982 | 0.7032121010324246 | 0.001
14 | 0.14012356102466583 | 0.273042273042273 | 0.7983576642335767 | 0.6875097222118577 | 0.001
15 | 0.13785772025585175 | 0.2817047817047817 | 0.8048810652595126 | 0.7001361694791496 | 0.001
16 | 0.1429404616355896 | 0.2681912681912682 | 0.7968854097268487 | 0.7063273106998997 | 0.001
17 | 0.1451471894979477 | 0.26126126126126126 | 0.7956287718153646 | 0.6860743816280108 | 0.001
18 | 0.141770601272583 | 0.2713097713097713 | 0.7906203368151778 | 0.6849355289660601 | 0.001
19 | 0.14384245872497559 | 0.2654192654192654 | 0.7899699957136733 | 0.6794374521554336 | 0.001
20 | 0.13193023204803467 | 0.28655578655578656 | 0.8068363147728227 | 0.7201978132992005 | 0.0001
21 | 0.13121400773525238 | 0.2875952875952876 | 0.8080536912751679 | 0.7236910659256566 | 0.0001
22 | 0.1310088187456131 | 0.2934857934857935 | 0.810120343368793 | 0.7222147145142929 | 0.0001
23 | 0.1304517388343811 | 0.2934857934857935 | 0.8120394137616957 | 0.7226400439644629 | 0.0001
24 | 0.13093852996826172 | 0.29521829521829523 | 0.8096162584162916 | 0.7237916982943077 | 0.0001
25 | 0.13081994652748108 | 0.2948717948717949 | 0.8093388464269307 | 0.7170657451815683 | 0.0001
26 | 0.13007444143295288 | 0.2910602910602911 | 0.8099862459884133 | 0.7200172245050901 | 0.0001
27 | 0.13034380972385406 | 0.29244629244629244 | 0.8082065853250877 | 0.7207907434740295 | 0.0001
28 | 0.13018907606601715 | 0.29695079695079696 | 0.810349848163401 | 0.7217805682073449 | 0.0001
29 | 0.13019531965255737 | 0.29625779625779625 | 0.8104190823256585 | 0.723719101965087 | 0.0001
30 | 0.13030356168746948 | 0.2955647955647956 | 0.8096606287736832 | 0.718144679800513 | 0.0001
31 | 0.1301266849040985 | 0.2959112959112959 | 0.8092418049879057 | 0.7189603352791966 | 0.0001
32 | 0.1301257312297821 | 0.2927927927927928 | 0.8097980303789017 | 0.7210148516296496 | 0.0001
33 | 0.12959885597229004 | 0.29625779625779625 | 0.8099594769603543 | 0.7204264964359948 | 1e-05
34 | 0.12959957122802734 | 0.2955647955647956 | 0.8100854344655136 | 0.722168676552786 | 1e-05
35 | 0.12954092025756836 | 0.2955647955647956 | 0.8108894430590192 | 0.7220033007887567 | 1e-05
36 | 0.12953610718250275 | 0.29313929313929316 | 0.8104569713142095 | 0.7211650841899886 | 1e-05
37 | 0.1295497566461563 | 0.29625779625779625 | 0.8118778893007372 | 0.7239071903959954 | 1e-05
38 | 0.12949061393737793 | 0.2959112959112959 | 0.8104318798247445 | 0.7212977755433345 | 1e-05
39 | 0.12945865094661713 | 0.2966042966042966 | 0.8106218263547823 | 0.7221707642640621 | 1e-05
40 | 0.12946291267871857 | 0.2955647955647956 | 0.8113418729013804 | 0.7232749192333074 | 1e-05
41 | 0.1294611394405365 | 0.2945252945252945 | 0.8100071001962995 | 0.722313917509489 | 1e-05
42 | 0.12951640784740448 | 0.2972972972972973 | 0.8111398315684148 | 0.7219276596712088 | 1e-05
43 | 0.12940654158592224 | 0.29313929313929316 | 0.8097862391449566 | 0.7212066160587719 | 1e-05
44 | 0.12948854267597198 | 0.29695079695079696 | 0.8108311081441923 | 0.7211905265653523 | 1e-05
45 | 0.12943118810653687 | 0.2945252945252945 | 0.8103943697164036 | 0.7217673828508766 | 1e-05
46 | 0.12941767275333405 | 0.29764379764379767 | 0.8113435070065285 | 0.7232663108413819 | 1e-05
47 | 0.12936843931674957 | 0.2945252945252945 | 0.8107185952648442 | 0.7229077354567445 | 1e-05
48 | 0.12944123148918152 | 0.2955647955647956 | 0.8102512730611904 | 0.7208766406208041 | 1e-05
49 | 0.12932655215263367 | 0.2959112959112959 | 0.8111032502392942 | 0.7215165769975259 | 1e-05
50 | 0.1294257938861847 | 0.2966042966042966 | 0.8106959890041235 | 0.7210862927892402 | 1e-05
51 | 0.12937645614147186 | 0.29244629244629244 | 0.8098573930447837 | 0.7224236625273444 | 1e-05
52 | 0.12941104173660278 | 0.2972972972972973 | 0.8110019973368842 | 0.7223932851056244 | 1e-05
53 | 0.12947481870651245 | 0.29799029799029797 | 0.8110783049860689 | 0.7225026360610024 | 1e-05
54 | 0.12942463159561157 | 0.29625779625779625 | 0.8104531646623112 | 0.7221711249170111 | 1e-05
55 | 0.12934881448745728 | 0.2955647955647956 | 0.8107163657542226 | 0.7231230527181782 | 1e-05
56 | 0.12935101985931396 | 0.2959112959112959 | 0.810738813735692 | 0.7226955143721722 | 1e-06
57 | 0.12934598326683044 | 0.2955647955647956 | 0.8110560712650376 | 0.7230703100391168 | 1e-06
58 | 0.1293543428182602 | 0.2966042966042966 | 0.8112406328059951 | 0.7230017316798248 | 1e-06
59 | 0.12936049699783325 | 0.2966042966042966 | 0.8110088687179914 | 0.7227156089091311 | 1e-06
---

# CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

- **Emissions**: 0.7291228651023076 grams of CO2
- **Source**: Code Carbon
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
---

# Framework Versions

- **Transformers**: 4.41.1
- **Pytorch**: 2.3.0+cu121
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1