# DinoVdeau-large-2024_04_03-with_data_aug_batch-size32_epochs150_freeze
DinoVd'eau is a fine-tuned version of facebook/dinov2-large on the multilabel_complete_dataset dataset. It achieves the following results on the evaluation set (a sketch of how these multilabel metrics are computed follows the list):
- Loss: 0.1181
- F1 Micro: 0.8219
- F1 Macro: 0.7131
- Roc Auc: 0.8797
- Accuracy: 0.3214
- Learning Rate: 0.0000
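These are multilabel metrics: F1 and ROC AUC are computed over the label indicator matrix, and Accuracy is exact-match (subset) accuracy, which is why it is much lower than the F1 scores. Below is a minimal scikit-learn sketch; the 0.5 decision threshold and the averaging choices are assumptions, not documented settings of the original evaluation:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def multilabel_metrics(y_true: np.ndarray, y_score: np.ndarray, threshold: float = 0.5) -> dict:
    """y_true: (n_samples, n_labels) binary matrix; y_score: predicted probabilities."""
    y_pred = (y_score >= threshold).astype(int)  # assumed 0.5 threshold
    return {
        "f1_micro": f1_score(y_true, y_pred, average="micro"),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "roc_auc": roc_auc_score(y_true, y_score, average="micro"),
        # Subset accuracy: a sample counts only if *all* of its labels are
        # predicted correctly, hence the much lower value than the F1 scores.
        "accuracy": accuracy_score(y_true, y_pred),
    }
```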
## Model description
DinoVd'eau is a model built on top of the DINOv2 backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
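For illustration, one way such a head can be laid out in PyTorch. Only the layer types come from the description above; the layer sizes, ordering, and dropout rate are assumptions (1024 is the dinov2-large hidden size, 31 matches the class table below):

```python
import torch.nn as nn

class ClassificationHead(nn.Module):
    """Sketch of a linear + ReLU + batch-norm + dropout head (layout assumed)."""

    def __init__(self, hidden_size: int = 1024, num_labels: int = 31, dropout: float = 0.1):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.BatchNorm1d(hidden_size),
            nn.Dropout(dropout),
            # One logit per class; a sigmoid is applied at loss/inference time
            # because the labels are not mutually exclusive.
            nn.Linear(hidden_size, num_labels),
        )

    def forward(self, features):
        return self.head(features)
```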
- Developed by: lombardata, credits to César Leblanc and Victor Illien
## Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
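A minimal inference sketch follows. It assumes the checkpoint loads through the standard Transformers image-classification auto classes and uses an assumed 0.5 decision threshold; if the repository defines a custom architecture, follow its own loading instructions instead:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "lombardata/DinoVdeau-large-2024_04_03-with_data_aug_batch-size32_epochs150_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("reef_photo.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)[0]  # multilabel: independent sigmoid per class
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```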
## Training and evaluation data
Details on the number of images for each class are given in the following table:
Class | train | val | test | Total |
---|---|---|---|---|
Acropore_branched | 1504 | 445 | 430 | 2379 |
Acropore_digitised | 593 | 151 | 144 | 888 |
Acropore_sub_massive | 148 | 54 | 41 | 243 |
Acropore_tabular | 1012 | 290 | 287 | 1589 |
Algae_assembly | 2545 | 858 | 835 | 4238 |
Algae_drawn_up | 376 | 123 | 121 | 620 |
Algae_limestone | 1652 | 561 | 559 | 2772 |
Algae_sodding | 3094 | 1011 | 1012 | 5117 |
Atra/Leucospilota | 1081 | 352 | 359 | 1792 |
Bleached_coral | 220 | 70 | 70 | 360 |
Blurred | 192 | 62 | 66 | 320 |
Dead_coral | 2001 | 637 | 626 | 3264 |
Fish | 2068 | 611 | 642 | 3321 |
Homo_sapiens | 162 | 60 | 60 | 282 |
Human_object | 157 | 60 | 53 | 270 |
Living_coral | 147 | 56 | 47 | 250 |
Millepore | 378 | 131 | 128 | 637 |
No_acropore_encrusting | 422 | 152 | 151 | 725 |
No_acropore_foliaceous | 200 | 46 | 40 | 286 |
No_acropore_massive | 1033 | 337 | 335 | 1705 |
No_acropore_solitary | 193 | 56 | 54 | 303 |
No_acropore_sub_massive | 1412 | 418 | 426 | 2256 |
Rock | 4487 | 1481 | 1489 | 7457 |
Sand | 5806 | 1959 | 1954 | 9719 |
Rubble | 3063 | 1030 | 1030 | 5123 |
Sea_cucumber | 1396 | 453 | 445 | 2294 |
Sea_urchins | 319 | 122 | 104 | 545 |
Sponge | 273 | 107 | 90 | 470 |
Syringodium_isoetifolium | 1198 | 399 | 398 | 1995 |
Thalassodendron_ciliatum | 781 | 260 | 262 | 1303 |
Useless | 579 | 193 | 193 | 965 |
## Training procedure
### Data Augmentation
Data were augmented using the following transformations (a Kornia sketch of the training-time pipeline follows the list):
- training transformations:
  Sequential(
    (0): PreProcess()
    (1): Resize(output_size=(518, 518), p=1.0, p_batch=1.0, same_on_batch=True, size=(518, 518), side=short, resample=bilinear, align_corners=True, antialias=False)
    (2): RandomHorizontalFlip(p=0.25, p_batch=1.0, same_on_batch=False)
    (3): RandomVerticalFlip(p=0.25, p_batch=1.0, same_on_batch=False)
    (4): ColorJiggle(brightness=0.0, contrast=0.0, saturation=0.0, hue=0.0, p=0.25, p_batch=1.0, same_on_batch=False)
    (5): RandomPerspective(distortion_scale=0.5, p=0.25, p_batch=1.0, same_on_batch=False, align_corners=False, resample=bilinear)
    (6): Normalize(p=1.0, p_batch=1.0, same_on_batch=True, mean=tensor([0.4850, 0.4560, 0.4060]), std=tensor([0.2290, 0.2240, 0.2250]))
  )
- validation transformations:
  Sequential(
    (0): PreProcess()
    (1): Resize(output_size=(518, 518), p=1.0, p_batch=1.0, same_on_batch=True, size=(518, 518), side=short, resample=bilinear, align_corners=True, antialias=False)
    (2): Normalize(p=1.0, p_batch=1.0, same_on_batch=True, mean=tensor([0.4850, 0.4560, 0.4060]), std=tensor([0.2290, 0.2240, 0.2250]))
  )
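These are Kornia augmentation stacks (PreProcess is the project's own tensor-conversion step and is omitted below). As a reference, here is a sketch of the training-time pipeline rebuilt with kornia.augmentation; note that ColorJiggle with all-zero ranges is effectively a no-op:

```python
import torch
import kornia.augmentation as K

# Training-time augmentation stack mirroring the repr above (PreProcess omitted).
train_transforms = K.AugmentationSequential(
    K.Resize((518, 518), side="short", resample="bilinear",
             align_corners=True, antialias=False),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.0, contrast=0.0, saturation=0.0, hue=0.0, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.4850, 0.4560, 0.4060]),
                std=torch.tensor([0.2290, 0.2240, 0.2250])),
)
# The validation pipeline keeps only the deterministic steps (Resize + Normalize).
```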
### Training hyperparameters
The following hyperparameters were used during training (a PyTorch sketch of this setup follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- freeze_encoder: True
- num_epochs: 150
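A minimal sketch of this configuration, assuming `model` is the DinoVdeau classifier; the `backbone` attribute name and the `train_one_epoch`/`evaluate` helpers are hypothetical:

```python
import torch

torch.manual_seed(42)  # seed: 42

# freeze_encoder=True: only the classification head is trained.
for param in model.backbone.parameters():  # attribute name assumed
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad),
    lr=5e-5, betas=(0.9, 0.999), eps=1e-08,
)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5,
)

for epoch in range(150):
    train_one_epoch(model, optimizer)  # hypothetical helper
    val_loss = evaluate(model)         # hypothetical helper
    scheduler.step(val_loss)           # cut LR by 10x after 5 epochs without improvement
```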
### Training results
Training Loss | Epoch | Step | Accuracy | F1 Macro | F1 Micro | Validation Loss | Roc Auc | Learning Rate |
---|---|---|---|---|---|---|---|---|
No log | 1.0 | 271 | 0.2207 | 0.4907 | 0.7369 | 0.1679 | 0.8188 | 0.001 |
0.2713 | 2.0 | 542 | 0.2515 | 0.5389 | 0.7614 | 0.1540 | 0.8356 | 0.001 |
0.2713 | 3.0 | 813 | 0.2526 | 0.6054 | 0.7728 | 0.1477 | 0.8472 | 0.001 |
0.1679 | 4.0 | 1084 | 0.2594 | 0.5848 | 0.7755 | 0.1578 | 0.8442 | 0.001 |
0.1679 | 5.0 | 1355 | 0.2618 | 0.6125 | 0.7819 | 0.1426 | 0.8555 | 0.001 |
0.1598 | 6.0 | 1626 | 0.2550 | 0.6239 | 0.7822 | 0.1422 | 0.8542 | 0.001 |
0.1598 | 7.0 | 1897 | 0.2557 | 0.6320 | 0.7825 | 0.1426 | 0.8534 | 0.001 |
0.1571 | 8.0 | 2168 | 0.2629 | 0.6223 | 0.7756 | 0.1528 | 0.8437 | 0.001 |
0.1571 | 9.0 | 2439 | 0.2481 | 0.6413 | 0.7796 | 0.1438 | 0.8549 | 0.001 |
0.1554 | 10.0 | 2710 | 0.2697 | 0.6289 | 0.7889 | 0.1405 | 0.8621 | 0.001 |
0.1554 | 11.0 | 2981 | 0.2684 | 0.6222 | 0.7898 | 0.1409 | 0.8614 | 0.001 |
0.1536 | 12.0 | 3252 | 0.2725 | 0.6166 | 0.7863 | 0.1392 | 0.8528 | 0.001 |
0.1526 | 13.0 | 3523 | 0.2625 | 0.6419 | 0.7877 | 0.1399 | 0.8559 | 0.001 |
0.1526 | 14.0 | 3794 | 0.2649 | 0.6326 | 0.7860 | 0.1438 | 0.8609 | 0.001 |
0.1535 | 15.0 | 4065 | 0.2735 | 0.6499 | 0.7930 | 0.1377 | 0.8625 | 0.001 |
0.1535 | 16.0 | 4336 | 0.2677 | 0.6435 | 0.7868 | 0.1397 | 0.8526 | 0.001 |
0.1517 | 17.0 | 4607 | 0.2646 | 0.6401 | 0.7928 | 0.1382 | 0.8634 | 0.001 |
0.1517 | 18.0 | 4878 | 0.2684 | 0.6286 | 0.7912 | 0.1392 | 0.8624 | 0.001 |
0.1524 | 19.0 | 5149 | 0.2636 | 0.6183 | 0.7874 | 0.1392 | 0.8576 | 0.001 |
0.1524 | 20.0 | 5420 | 0.2598 | 0.6286 | 0.7878 | 0.1386 | 0.8578 | 0.001 |
0.1527 | 21.0 | 5691 | 0.2601 | 0.6408 | 0.7880 | 0.1374 | 0.8557 | 0.001 |
0.1527 | 22.0 | 5962 | 0.2704 | 0.6476 | 0.7897 | 0.1377 | 0.8577 | 0.001 |
0.1513 | 23.0 | 6233 | 0.2697 | 0.6443 | 0.7955 | 0.1373 | 0.8655 | 0.001 |
0.1514 | 24.0 | 6504 | 0.2656 | 0.6477 | 0.7877 | 0.1593 | 0.8547 | 0.001 |
0.1514 | 25.0 | 6775 | 0.2656 | 0.6477 | 0.7909 | 0.1371 | 0.8619 | 0.001 |
0.1513 | 26.0 | 7046 | 0.2666 | 0.6273 | 0.7871 | 0.1374 | 0.8535 | 0.001 |
0.1513 | 27.0 | 7317 | 0.2646 | 0.6470 | 0.7934 | 0.1373 | 0.8595 | 0.001 |
0.1508 | 28.0 | 7588 | 0.2735 | 0.6523 | 0.7933 | 0.1353 | 0.8584 | 0.001 |
0.1508 | 29.0 | 7859 | 0.2776 | 0.6522 | 0.7960 | 0.1362 | 0.8645 | 0.001 |
0.1506 | 30.0 | 8130 | 0.2505 | 0.6283 | 0.7849 | 0.1384 | 0.8546 | 0.001 |
0.1506 | 31.0 | 8401 | 0.2718 | 0.6630 | 0.7964 | 0.1342 | 0.8636 | 0.001 |
0.151 | 32.0 | 8672 | 0.2718 | 0.6556 | 0.7968 | 0.1366 | 0.8696 | 0.001 |
0.151 | 33.0 | 8943 | 0.2824 | 0.6635 | 0.7985 | 0.1359 | 0.8701 | 0.001 |
0.1507 | 34.0 | 9214 | 0.2814 | 0.6400 | 0.7999 | 0.1335 | 0.8657 | 0.001 |
0.1507 | 35.0 | 9485 | 0.2725 | 0.6520 | 0.7963 | 0.1343 | 0.8653 | 0.001 |
0.1495 | 36.0 | 9756 | 0.2636 | 0.6451 | 0.7924 | 0.1429 | 0.8626 | 0.001 |
0.1496 | 37.0 | 10027 | 0.2732 | 0.6531 | 0.7981 | 0.1331 | 0.8638 | 0.001 |
0.1496 | 38.0 | 10298 | 0.2684 | 0.6306 | 0.7938 | 0.1350 | 0.8617 | 0.001 |
0.1503 | 39.0 | 10569 | 0.2800 | 0.6465 | 0.7984 | 0.1352 | 0.8661 | 0.001 |
0.1503 | 40.0 | 10840 | 0.2728 | 0.6271 | 0.7925 | 0.1347 | 0.8594 | 0.001 |
0.1505 | 41.0 | 11111 | 0.2721 | 0.6601 | 0.7935 | 0.1340 | 0.8579 | 0.001 |
0.1505 | 42.0 | 11382 | 0.2711 | 0.6636 | 0.7983 | 0.1322 | 0.8652 | 0.001 |
0.1491 | 43.0 | 11653 | 0.2735 | 0.6493 | 0.7949 | 0.1360 | 0.8635 | 0.001 |
0.1491 | 44.0 | 11924 | 0.2814 | 0.6400 | 0.7955 | 0.1361 | 0.8625 | 0.001 |
0.1507 | 45.0 | 12195 | 0.2814 | 0.6424 | 0.7971 | 0.1328 | 0.8640 | 0.001 |
0.1507 | 46.0 | 12466 | 0.2787 | 0.6469 | 0.7939 | 0.1328 | 0.8581 | 0.001 |
0.1495 | 47.0 | 12737 | 0.2752 | 0.6351 | 0.7977 | 0.1332 | 0.8672 | 0.001 |
0.1498 | 48.0 | 13008 | 0.2817 | 0.6490 | 0.8013 | 0.1325 | 0.8694 | 0.001 |
0.1498 | 49.0 | 13279 | 0.2883 | 0.6738 | 0.8062 | 0.1283 | 0.8710 | 0.0001 |
0.1416 | 50.0 | 13550 | 0.2872 | 0.6734 | 0.8087 | 0.1287 | 0.8747 | 0.0001 |
0.1416 | 51.0 | 13821 | 0.2900 | 0.6714 | 0.8067 | 0.1280 | 0.8706 | 0.0001 |
0.1387 | 52.0 | 14092 | 0.2900 | 0.6744 | 0.8067 | 0.1262 | 0.8702 | 0.0001 |
0.1387 | 53.0 | 14363 | 0.2910 | 0.6764 | 0.8094 | 0.1262 | 0.8729 | 0.0001 |
0.1356 | 54.0 | 14634 | 0.2948 | 0.6744 | 0.8091 | 0.1257 | 0.8702 | 0.0001 |
0.1356 | 55.0 | 14905 | 0.2948 | 0.6814 | 0.8106 | 0.1257 | 0.8742 | 0.0001 |
0.1348 | 56.0 | 15176 | 0.3010 | 0.6772 | 0.8108 | 0.1260 | 0.8738 | 0.0001 |
0.1348 | 57.0 | 15447 | 0.2986 | 0.6806 | 0.8129 | 0.1250 | 0.8768 | 0.0001 |
0.135 | 58.0 | 15718 | 0.3082 | 0.6859 | 0.8142 | 0.1242 | 0.8762 | 0.0001 |
0.135 | 59.0 | 15989 | 0.3027 | 0.6870 | 0.8124 | 0.1245 | 0.8763 | 0.0001 |
0.1334 | 60.0 | 16260 | 0.3030 | 0.6854 | 0.8138 | 0.1242 | 0.8772 | 0.0001 |
0.1335 | 61.0 | 16531 | 0.3065 | 0.6889 | 0.8140 | 0.1240 | 0.8756 | 0.0001 |
0.1335 | 62.0 | 16802 | 0.3016 | 0.6809 | 0.8152 | 0.1249 | 0.8798 | 0.0001 |
0.1308 | 63.0 | 17073 | 0.3068 | 0.6848 | 0.8146 | 0.1233 | 0.8757 | 0.0001 |
0.1308 | 64.0 | 17344 | 0.3058 | 0.6908 | 0.8151 | 0.1234 | 0.8769 | 0.0001 |
0.1326 | 65.0 | 17615 | 0.3034 | 0.6812 | 0.8124 | 0.1233 | 0.8735 | 0.0001 |
0.1326 | 66.0 | 17886 | 0.3027 | 0.6878 | 0.8145 | 0.1232 | 0.8788 | 0.0001 |
0.1306 | 67.0 | 18157 | 0.3075 | 0.6857 | 0.8115 | 0.1228 | 0.8707 | 0.0001 |
0.1306 | 68.0 | 18428 | 0.3075 | 0.6913 | 0.8153 | 0.1226 | 0.8767 | 0.0001 |
0.1299 | 69.0 | 18699 | 0.3085 | 0.6764 | 0.8143 | 0.1227 | 0.8751 | 0.0001 |
0.1299 | 70.0 | 18970 | 0.3106 | 0.6999 | 0.8187 | 0.1230 | 0.8838 | 0.0001 |
0.1295 | 71.0 | 19241 | 0.3068 | 0.6893 | 0.8153 | 0.1225 | 0.8756 | 0.0001 |
0.1289 | 72.0 | 19512 | 0.3037 | 0.6868 | 0.8151 | 0.1223 | 0.8776 | 0.0001 |
0.1289 | 73.0 | 19783 | 0.3054 | 0.6918 | 0.8165 | 0.1223 | 0.8782 | 0.0001 |
0.1279 | 74.0 | 20054 | 0.3054 | 0.6856 | 0.8143 | 0.1225 | 0.8747 | 0.0001 |
0.1279 | 75.0 | 20325 | 0.3102 | 0.6878 | 0.8167 | 0.1221 | 0.8784 | 0.0001 |
0.1276 | 76.0 | 20596 | 0.3167 | 0.6964 | 0.8190 | 0.1217 | 0.8812 | 0.0001 |
0.1276 | 77.0 | 20867 | 0.3102 | 0.6940 | 0.8179 | 0.1217 | 0.8796 | 0.0001 |
0.1274 | 78.0 | 21138 | 0.3082 | 0.6859 | 0.8143 | 0.1216 | 0.8735 | 0.0001 |
0.1274 | 79.0 | 21409 | 0.3147 | 0.6945 | 0.8165 | 0.1215 | 0.8766 | 0.0001 |
0.1269 | 80.0 | 21680 | 0.3147 | 0.6999 | 0.8193 | 0.1214 | 0.8803 | 0.0001 |
0.1269 | 81.0 | 21951 | 0.3113 | 0.6974 | 0.8194 | 0.1214 | 0.8828 | 0.0001 |
0.1259 | 82.0 | 22222 | 0.3102 | 0.6956 | 0.8171 | 0.1212 | 0.8782 | 0.0001 |
0.1259 | 83.0 | 22493 | 0.3123 | 0.6970 | 0.8190 | 0.1208 | 0.8791 | 0.0001 |
0.1258 | 84.0 | 22764 | 0.3154 | 0.6997 | 0.8204 | 0.1209 | 0.8813 | 0.0001 |
0.1251 | 85.0 | 23035 | 0.3065 | 0.6935 | 0.8163 | 0.1211 | 0.8752 | 0.0001 |
0.1251 | 86.0 | 23306 | 0.3154 | 0.6972 | 0.8201 | 0.1203 | 0.8804 | 0.0001 |
0.1251 | 87.0 | 23577 | 0.3150 | 0.6947 | 0.8182 | 0.1208 | 0.8785 | 0.0001 |
0.1251 | 88.0 | 23848 | 0.3154 | 0.6937 | 0.8181 | 0.1214 | 0.8788 | 0.0001 |
0.1246 | 89.0 | 24119 | 0.3106 | 0.6953 | 0.8201 | 0.1206 | 0.8797 | 0.0001 |
0.1246 | 90.0 | 24390 | 0.3164 | 0.6960 | 0.8214 | 0.1210 | 0.8819 | 0.0001 |
0.1239 | 91.0 | 24661 | 0.3154 | 0.7006 | 0.8202 | 0.1199 | 0.8805 | 0.0001 |
0.1239 | 92.0 | 24932 | 0.3161 | 0.7039 | 0.8222 | 0.1208 | 0.8856 | 0.0001 |
0.1238 | 93.0 | 25203 | 0.3133 | 0.7004 | 0.8199 | 0.1204 | 0.8808 | 0.0001 |
0.1238 | 94.0 | 25474 | 0.3143 | 0.7036 | 0.8230 | 0.1200 | 0.8847 | 0.0001 |
0.1237 | 95.0 | 25745 | 0.3188 | 0.7069 | 0.8209 | 0.1206 | 0.8817 | 0.0001 |
0.1234 | 96.0 | 26016 | 0.3147 | 0.7060 | 0.8222 | 0.1201 | 0.8820 | 0.0001 |
0.1234 | 97.0 | 26287 | 0.3092 | 0.7074 | 0.8208 | 0.1204 | 0.8830 | 0.0001 |
0.1215 | 98.0 | 26558 | 0.3188 | 0.7125 | 0.8241 | 0.1200 | 0.8859 | 1e-05 |
0.1215 | 99.0 | 26829 | 0.3171 | 0.7127 | 0.8247 | 0.1195 | 0.8864 | 1e-05 |
0.1208 | 100.0 | 27100 | 0.3164 | 0.7077 | 0.8225 | 0.1192 | 0.8818 | 1e-05 |
0.1208 | 101.0 | 27371 | 0.3171 | 0.7060 | 0.8232 | 0.1193 | 0.8831 | 1e-05 |
0.1195 | 102.0 | 27642 | 0.3185 | 0.7105 | 0.8238 | 0.1197 | 0.8848 | 1e-05 |
0.1195 | 103.0 | 27913 | 0.3140 | 0.7076 | 0.8216 | 0.1191 | 0.8805 | 1e-05 |
0.1197 | 104.0 | 28184 | 0.3202 | 0.7063 | 0.8239 | 0.1193 | 0.8843 | 1e-05 |
0.1197 | 105.0 | 28455 | 0.3126 | 0.7071 | 0.8213 | 0.1190 | 0.8799 | 1e-05 |
0.1189 | 106.0 | 28726 | 0.3202 | 0.7061 | 0.8233 | 0.1190 | 0.8835 | 1e-05 |
0.1189 | 107.0 | 28997 | 0.3164 | 0.7038 | 0.8224 | 0.1194 | 0.8811 | 1e-05 |
0.1194 | 108.0 | 29268 | 0.3191 | 0.7110 | 0.8232 | 0.1191 | 0.8830 | 1e-05 |
0.1187 | 109.0 | 29539 | 0.3174 | 0.7101 | 0.8230 | 0.1189 | 0.8817 | 1e-05 |
0.1187 | 110.0 | 29810 | 0.3161 | 0.7044 | 0.8224 | 0.1192 | 0.8810 | 1e-05 |
0.1185 | 111.0 | 30081 | 0.3174 | 0.7083 | 0.8226 | 0.1192 | 0.8827 | 1e-05 |
0.1185 | 112.0 | 30352 | 0.3205 | 0.7093 | 0.8239 | 0.1190 | 0.8841 | 1e-05 |
0.119 | 113.0 | 30623 | 0.3171 | 0.7080 | 0.8233 | 0.1195 | 0.8845 | 1e-05 |
0.119 | 114.0 | 30894 | 0.3181 | 0.7062 | 0.8220 | 0.1190 | 0.8799 | 1e-05 |
0.1182 | 115.0 | 31165 | 0.3174 | 0.7081 | 0.8229 | 0.1192 | 0.8823 | 1e-05 |
0.1182 | 116.0 | 31436 | 0.3250 | 0.7128 | 0.8256 | 0.1190 | 0.8862 | 0.0000 |
0.1191 | 117.0 | 31707 | 0.3171 | 0.7104 | 0.8231 | 0.1187 | 0.8821 | 0.0000 |
0.1191 | 118.0 | 31978 | 0.3198 | 0.7061 | 0.8236 | 0.1189 | 0.8830 | 0.0000 |
0.1179 | 119.0 | 32249 | 0.3181 | 0.7080 | 0.8233 | 0.1189 | 0.8830 | 0.0000 |
0.1176 | 120.0 | 32520 | 0.3185 | 0.7101 | 0.8239 | 0.1190 | 0.8838 | 0.0000 |
0.1176 | 121.0 | 32791 | 0.3209 | 0.7128 | 0.8254 | 0.1195 | 0.8872 | 0.0000 |
0.1175 | 122.0 | 33062 | 0.3154 | 0.7048 | 0.8223 | 0.1192 | 0.8813 | 0.0000 |
0.1175 | 123.0 | 33333 | 0.3212 | 0.7154 | 0.8255 | 0.1192 | 0.8856 | 0.0000 |
0.1176 | 124.0 | 33604 | 0.3209 | 0.7109 | 0.8239 | 0.1189 | 0.8837 | 0.0000 |
0.1176 | 125.0 | 33875 | 0.3226 | 0.7102 | 0.8252 | 0.1189 | 0.8847 | 0.0000 |
0.1179 | 126.0 | 34146 | 0.3164 | 0.7025 | 0.8206 | 0.1189 | 0.8787 | 0.0000 |
0.1179 | 127.0 | 34417 | 0.3216 | 0.7104 | 0.8245 | 0.1190 | 0.8839 | 0.0000 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu118
- Datasets 2.18.0
- Tokenizers 0.15.0