segformer-b4-finetuned-ade-512-512_corm

This model is a fine-tuned version of nvidia/segformer-b4-finetuned-ade-512-512 on an unspecified corm-segmentation dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0391
  • Mean IoU: 0.9261
  • Mean Accuracy: 0.9594
  • Overall Accuracy: 0.9863
  • Accuracy Background: 0.9977
  • Accuracy Corm: 0.9268
  • Accuracy Damage: 0.9537
  • IoU Background: 0.9944
  • IoU Corm: 0.8758
  • IoU Damage: 0.9082
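
For reference, a minimal inference sketch for this checkpoint (it assumes the Hub id mujerry/segformer-b4-finetuned-ade-512-512_corm, a local image file, and the 0=background, 1=corm, 2=damage label order implied by the metrics above):

```python
# Minimal inference sketch; the image path and label order are assumptions.
from PIL import Image
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "mujerry/segformer-b4-finetuned-ade-512-512_corm"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("corm.jpg").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids (0=background, 1=corm, 2=damage, assumed order)
```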

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 50
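
A hedged sketch of how these hyperparameters map onto transformers' TrainingArguments. Dataset loading and preprocessing are omitted, and train_ds/eval_ds are hypothetical placeholders, since the card does not name the data:

```python
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

# Replace the 150-class ADE20K head with a 3-class head (background, corm, damage).
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b4-finetuned-ade-512-512",
    num_labels=3,
    ignore_mismatched_sizes=True,
)

args = TrainingArguments(
    output_dir="segformer-b4-finetuned-ade-512-512_corm",
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=50,
    # The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 match the
    # default optimizer settings, so they need no explicit arguments here.
)

train_ds = eval_ds = None  # hypothetical: the card does not name the training data
trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()  # run once real, preprocessed datasets are supplied
```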

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | IoU Background | IoU Corm | IoU Damage |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0242 | 0.6061 | 20 | 1.0187 | 0.3316 | 0.6012 | 0.5573 | 0.5345 | 0.5453 | 0.7237 | 0.5341 | 0.1224 | 0.3383 |
| 0.7686 | 1.2121 | 40 | 0.6913 | 0.7059 | 0.8624 | 0.9170 | 0.9369 | 0.7070 | 0.9434 | 0.9369 | 0.4718 | 0.7090 |
| 0.5281 | 1.8182 | 60 | 0.4782 | 0.8060 | 0.9165 | 0.9537 | 0.9697 | 0.8764 | 0.9032 | 0.9696 | 0.6651 | 0.7833 |
| 0.3931 | 2.4242 | 80 | 0.3279 | 0.8530 | 0.9308 | 0.9690 | 0.9843 | 0.8578 | 0.9503 | 0.9837 | 0.7547 | 0.8206 |
| 0.2574 | 3.0303 | 100 | 0.2112 | 0.8733 | 0.9335 | 0.9753 | 0.9915 | 0.8406 | 0.9685 | 0.9899 | 0.7898 | 0.8402 |
| 0.2112 | 3.6364 | 120 | 0.1588 | 0.8990 | 0.9450 | 0.9807 | 0.9952 | 0.8824 | 0.9576 | 0.9918 | 0.8337 | 0.8716 |
| 0.1545 | 4.2424 | 140 | 0.1198 | 0.8960 | 0.9398 | 0.9805 | 0.9965 | 0.8539 | 0.9690 | 0.9924 | 0.8245 | 0.8711 |
| 0.1127 | 4.8485 | 160 | 0.1152 | 0.8851 | 0.9395 | 0.9782 | 0.9973 | 0.9609 | 0.8604 | 0.9923 | 0.8191 | 0.8440 |
| 0.1147 | 5.4545 | 180 | 0.0862 | 0.9130 | 0.9546 | 0.9834 | 0.9956 | 0.9170 | 0.9513 | 0.9930 | 0.8579 | 0.8881 |
| 0.0945 | 6.0606 | 200 | 0.0793 | 0.9083 | 0.9457 | 0.9829 | 0.9977 | 0.8728 | 0.9667 | 0.9929 | 0.8437 | 0.8881 |
| 0.0942 | 6.6667 | 220 | 0.0730 | 0.9170 | 0.9519 | 0.9842 | 0.9985 | 0.9263 | 0.9309 | 0.9926 | 0.8624 | 0.8960 |
| 0.0766 | 7.2727 | 240 | 0.0675 | 0.9185 | 0.9565 | 0.9847 | 0.9972 | 0.9365 | 0.9358 | 0.9936 | 0.8656 | 0.8963 |
| 0.0674 | 7.8788 | 260 | 0.0635 | 0.9160 | 0.9523 | 0.9844 | 0.9972 | 0.8897 | 0.9700 | 0.9937 | 0.8585 | 0.8957 |
| 0.0662 | 8.4848 | 280 | 0.0593 | 0.9199 | 0.9520 | 0.9849 | 0.9985 | 0.8984 | 0.9590 | 0.9931 | 0.8637 | 0.9030 |
| 0.0683 | 9.0909 | 300 | 0.0582 | 0.9176 | 0.9516 | 0.9847 | 0.9981 | 0.8914 | 0.9652 | 0.9937 | 0.8598 | 0.8994 |
| 0.0591 | 9.6970 | 320 | 0.0548 | 0.9222 | 0.9565 | 0.9855 | 0.9974 | 0.9076 | 0.9644 | 0.9941 | 0.8693 | 0.9031 |
| 0.0649 | 10.3030 | 340 | 0.0541 | 0.9201 | 0.9553 | 0.9849 | 0.9984 | 0.9407 | 0.9269 | 0.9935 | 0.8685 | 0.8982 |
| 0.0622 | 10.9091 | 360 | 0.0525 | 0.9155 | 0.9497 | 0.9844 | 0.9982 | 0.8803 | 0.9707 | 0.9938 | 0.8551 | 0.8976 |
| 0.0637 | 11.5152 | 380 | 0.0542 | 0.9124 | 0.9466 | 0.9838 | 0.9986 | 0.8720 | 0.9692 | 0.9933 | 0.8491 | 0.8948 |
| 0.0647 | 12.1212 | 400 | 0.0486 | 0.9244 | 0.9562 | 0.9857 | 0.9988 | 0.9340 | 0.9358 | 0.9933 | 0.8738 | 0.9060 |
| 0.0418 | 12.7273 | 420 | 0.0466 | 0.9267 | 0.9589 | 0.9862 | 0.9980 | 0.9288 | 0.9500 | 0.9939 | 0.8773 | 0.9088 |
| 0.0459 | 13.3333 | 440 | 0.0460 | 0.9260 | 0.9601 | 0.9862 | 0.9977 | 0.9378 | 0.9448 | 0.9942 | 0.8770 | 0.9069 |
| 0.048 | 13.9394 | 460 | 0.0453 | 0.9253 | 0.9586 | 0.9861 | 0.9981 | 0.9335 | 0.9443 | 0.9941 | 0.8751 | 0.9067 |
| 0.0397 | 14.5455 | 480 | 0.0446 | 0.9263 | 0.9589 | 0.9863 | 0.9978 | 0.9219 | 0.9569 | 0.9943 | 0.8759 | 0.9088 |
| 0.0546 | 15.1515 | 500 | 0.0457 | 0.9219 | 0.9572 | 0.9854 | 0.9983 | 0.9447 | 0.9285 | 0.9940 | 0.8707 | 0.9010 |
| 0.0427 | 15.7576 | 520 | 0.0432 | 0.9267 | 0.9611 | 0.9863 | 0.9973 | 0.9374 | 0.9485 | 0.9943 | 0.8774 | 0.9084 |
| 0.0463 | 16.3636 | 540 | 0.0424 | 0.9263 | 0.9576 | 0.9863 | 0.9982 | 0.9148 | 0.9598 | 0.9941 | 0.8750 | 0.9098 |
| 0.048 | 16.9697 | 560 | 0.0421 | 0.9272 | 0.9588 | 0.9865 | 0.9981 | 0.9229 | 0.9555 | 0.9942 | 0.8776 | 0.9099 |
| 0.0534 | 17.5758 | 580 | 0.0420 | 0.9269 | 0.9610 | 0.9863 | 0.9975 | 0.9393 | 0.9460 | 0.9943 | 0.8777 | 0.9085 |
| 0.0411 | 18.1818 | 600 | 0.0420 | 0.9249 | 0.9581 | 0.9861 | 0.9976 | 0.9133 | 0.9634 | 0.9944 | 0.8734 | 0.9068 |
| 0.0417 | 18.7879 | 620 | 0.0425 | 0.9231 | 0.9573 | 0.9858 | 0.9973 | 0.9042 | 0.9705 | 0.9945 | 0.8699 | 0.9048 |
| 0.0368 | 19.3939 | 640 | 0.0405 | 0.9283 | 0.9603 | 0.9867 | 0.9979 | 0.9281 | 0.9548 | 0.9944 | 0.8795 | 0.9111 |
| 0.048 | 20.0 | 660 | 0.0399 | 0.9279 | 0.9608 | 0.9866 | 0.9977 | 0.9325 | 0.9522 | 0.9944 | 0.8793 | 0.9100 |
| 0.0363 | 20.6061 | 680 | 0.0415 | 0.9255 | 0.9593 | 0.9861 | 0.9981 | 0.9421 | 0.9378 | 0.9943 | 0.8761 | 0.9061 |
| 0.0459 | 21.2121 | 700 | 0.0421 | 0.9243 | 0.9583 | 0.9858 | 0.9984 | 0.9471 | 0.9294 | 0.9940 | 0.8748 | 0.9042 |
| 0.0436 | 21.8182 | 720 | 0.0403 | 0.9269 | 0.9610 | 0.9863 | 0.9976 | 0.9426 | 0.9429 | 0.9943 | 0.8782 | 0.9080 |
| 0.0461 | 22.4242 | 740 | 0.0406 | 0.9260 | 0.9615 | 0.9862 | 0.9974 | 0.9476 | 0.9396 | 0.9945 | 0.8771 | 0.9065 |
| 0.0319 | 23.0303 | 760 | 0.0395 | 0.9269 | 0.9614 | 0.9864 | 0.9970 | 0.9306 | 0.9567 | 0.9944 | 0.8780 | 0.9082 |
| 0.0366 | 23.6364 | 780 | 0.0392 | 0.9277 | 0.9607 | 0.9866 | 0.9978 | 0.9352 | 0.9492 | 0.9944 | 0.8793 | 0.9095 |
| 0.0351 | 24.2424 | 800 | 0.0390 | 0.9282 | 0.9605 | 0.9866 | 0.9979 | 0.9338 | 0.9497 | 0.9943 | 0.8796 | 0.9106 |
| 0.0322 | 24.8485 | 820 | 0.0388 | 0.9280 | 0.9600 | 0.9866 | 0.9980 | 0.9289 | 0.9531 | 0.9944 | 0.8790 | 0.9108 |
| 0.0346 | 25.4545 | 840 | 0.0392 | 0.9266 | 0.9595 | 0.9864 | 0.9976 | 0.9173 | 0.9635 | 0.9944 | 0.8760 | 0.9095 |
| 0.0342 | 26.0606 | 860 | 0.0398 | 0.9243 | 0.9575 | 0.9861 | 0.9976 | 0.9059 | 0.9691 | 0.9945 | 0.8715 | 0.9070 |
| 0.0389 | 26.6667 | 880 | 0.0387 | 0.9275 | 0.9617 | 0.9865 | 0.9970 | 0.9272 | 0.9609 | 0.9945 | 0.8793 | 0.9087 |
| 0.033 | 27.2727 | 900 | 0.0392 | 0.9272 | 0.9596 | 0.9865 | 0.9976 | 0.9168 | 0.9645 | 0.9944 | 0.8772 | 0.9099 |
| 0.0316 | 27.8788 | 920 | 0.0388 | 0.9269 | 0.9602 | 0.9865 | 0.9973 | 0.9196 | 0.9636 | 0.9945 | 0.8768 | 0.9095 |
| 0.0391 | 28.4848 | 940 | 0.0396 | 0.9262 | 0.9604 | 0.9863 | 0.9970 | 0.9200 | 0.9642 | 0.9945 | 0.8760 | 0.9082 |
| 0.0305 | 29.0909 | 960 | 0.0386 | 0.9275 | 0.9600 | 0.9865 | 0.9978 | 0.9241 | 0.9580 | 0.9945 | 0.8780 | 0.9101 |
| 0.034 | 29.6970 | 980 | 0.0392 | 0.9267 | 0.9600 | 0.9863 | 0.9980 | 0.9399 | 0.9421 | 0.9943 | 0.8777 | 0.9081 |
| 0.0322 | 30.3030 | 1000 | 0.0383 | 0.9275 | 0.9607 | 0.9865 | 0.9976 | 0.9321 | 0.9524 | 0.9945 | 0.8786 | 0.9094 |
| 0.0288 | 30.9091 | 1020 | 0.0389 | 0.9271 | 0.9606 | 0.9864 | 0.9976 | 0.9339 | 0.9504 | 0.9944 | 0.8782 | 0.9087 |
| 0.0324 | 31.5152 | 1040 | 0.0394 | 0.9265 | 0.9601 | 0.9863 | 0.9978 | 0.9362 | 0.9462 | 0.9944 | 0.8770 | 0.9080 |
| 0.0329 | 32.1212 | 1060 | 0.0399 | 0.9259 | 0.9599 | 0.9862 | 0.9980 | 0.9421 | 0.9396 | 0.9943 | 0.8767 | 0.9068 |
| 0.0211 | 32.7273 | 1080 | 0.0390 | 0.9268 | 0.9593 | 0.9864 | 0.9981 | 0.9310 | 0.9490 | 0.9943 | 0.8773 | 0.9087 |
| 0.0227 | 33.3333 | 1100 | 0.0389 | 0.9269 | 0.9581 | 0.9864 | 0.9983 | 0.9194 | 0.9565 | 0.9941 | 0.8764 | 0.9101 |
| 0.0328 | 33.9394 | 1120 | 0.0391 | 0.9270 | 0.9587 | 0.9864 | 0.9983 | 0.9284 | 0.9494 | 0.9941 | 0.8773 | 0.9096 |
| 0.0297 | 34.5455 | 1140 | 0.0389 | 0.9267 | 0.9597 | 0.9864 | 0.9979 | 0.9304 | 0.9509 | 0.9944 | 0.8771 | 0.9087 |
| 0.0346 | 35.1515 | 1160 | 0.0390 | 0.9267 | 0.9595 | 0.9864 | 0.9979 | 0.9292 | 0.9516 | 0.9943 | 0.8769 | 0.9088 |
| 0.0231 | 35.7576 | 1180 | 0.0391 | 0.9266 | 0.9587 | 0.9863 | 0.9981 | 0.9232 | 0.9547 | 0.9942 | 0.8764 | 0.9093 |
| 0.0301 | 36.3636 | 1200 | 0.0387 | 0.9267 | 0.9594 | 0.9864 | 0.9978 | 0.9232 | 0.9572 | 0.9944 | 0.8764 | 0.9093 |
| 0.0331 | 36.9697 | 1220 | 0.0388 | 0.9269 | 0.9597 | 0.9864 | 0.9979 | 0.9290 | 0.9522 | 0.9943 | 0.8772 | 0.9091 |
| 0.0281 | 37.5758 | 1240 | 0.0389 | 0.9268 | 0.9589 | 0.9864 | 0.9981 | 0.9266 | 0.9520 | 0.9943 | 0.8769 | 0.9093 |
| 0.0208 | 38.1818 | 1260 | 0.0390 | 0.9266 | 0.9605 | 0.9863 | 0.9975 | 0.9318 | 0.9523 | 0.9944 | 0.8768 | 0.9086 |
| 0.0348 | 38.7879 | 1280 | 0.0397 | 0.9257 | 0.9598 | 0.9862 | 0.9978 | 0.9387 | 0.9429 | 0.9943 | 0.8760 | 0.9068 |
| 0.0276 | 39.3939 | 1300 | 0.0388 | 0.9269 | 0.9590 | 0.9864 | 0.9981 | 0.9267 | 0.9522 | 0.9942 | 0.8772 | 0.9093 |
| 0.0286 | 40.0 | 1320 | 0.0395 | 0.9248 | 0.9572 | 0.9861 | 0.9979 | 0.9082 | 0.9655 | 0.9944 | 0.8723 | 0.9076 |
| 0.0298 | 40.6061 | 1340 | 0.0391 | 0.9263 | 0.9592 | 0.9863 | 0.9977 | 0.9212 | 0.9587 | 0.9944 | 0.8759 | 0.9087 |
| 0.0235 | 41.2121 | 1360 | 0.0389 | 0.9262 | 0.9590 | 0.9863 | 0.9978 | 0.9220 | 0.9574 | 0.9944 | 0.8757 | 0.9085 |
| 0.0223 | 41.8182 | 1380 | 0.0392 | 0.9265 | 0.9593 | 0.9863 | 0.9979 | 0.9273 | 0.9528 | 0.9943 | 0.8765 | 0.9088 |
| 0.0216 | 42.4242 | 1400 | 0.0390 | 0.9264 | 0.9592 | 0.9863 | 0.9979 | 0.9265 | 0.9531 | 0.9943 | 0.8761 | 0.9086 |
| 0.027 | 43.0303 | 1420 | 0.0395 | 0.9264 | 0.9601 | 0.9863 | 0.9976 | 0.9315 | 0.9511 | 0.9944 | 0.8764 | 0.9083 |
| 0.0261 | 43.6364 | 1440 | 0.0392 | 0.9265 | 0.9593 | 0.9863 | 0.9980 | 0.9309 | 0.9491 | 0.9943 | 0.8768 | 0.9083 |
| 0.0213 | 44.2424 | 1460 | 0.0390 | 0.9262 | 0.9592 | 0.9863 | 0.9977 | 0.9213 | 0.9584 | 0.9944 | 0.8756 | 0.9087 |
| 0.0228 | 44.8485 | 1480 | 0.0390 | 0.9260 | 0.9596 | 0.9863 | 0.9976 | 0.9240 | 0.9571 | 0.9944 | 0.8756 | 0.9081 |
| 0.0317 | 45.4545 | 1500 | 0.0391 | 0.9259 | 0.9584 | 0.9862 | 0.9980 | 0.9227 | 0.9544 | 0.9942 | 0.8751 | 0.9082 |
| 0.0244 | 46.0606 | 1520 | 0.0392 | 0.9261 | 0.9591 | 0.9863 | 0.9979 | 0.9274 | 0.9521 | 0.9943 | 0.8759 | 0.9081 |
| 0.0282 | 46.6667 | 1540 | 0.0391 | 0.9259 | 0.9589 | 0.9862 | 0.9978 | 0.9224 | 0.9564 | 0.9944 | 0.8750 | 0.9082 |
| 0.0244 | 47.2727 | 1560 | 0.0397 | 0.9262 | 0.9592 | 0.9863 | 0.9979 | 0.9280 | 0.9519 | 0.9944 | 0.8760 | 0.9081 |
| 0.0265 | 47.8788 | 1580 | 0.0393 | 0.9258 | 0.9592 | 0.9862 | 0.9977 | 0.9226 | 0.9572 | 0.9944 | 0.8751 | 0.9080 |
| 0.0282 | 48.4848 | 1600 | 0.0394 | 0.9260 | 0.9585 | 0.9863 | 0.9980 | 0.9209 | 0.9567 | 0.9943 | 0.8752 | 0.9084 |
| 0.0229 | 49.0909 | 1620 | 0.0390 | 0.9262 | 0.9592 | 0.9863 | 0.9979 | 0.9281 | 0.9517 | 0.9943 | 0.8760 | 0.9083 |
| 0.0236 | 49.6970 | 1640 | 0.0391 | 0.9261 | 0.9594 | 0.9863 | 0.9977 | 0.9268 | 0.9537 | 0.9944 | 0.8758 | 0.9082 |
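
The card does not state how these metrics were produced, but the set of aggregates and per-category scores matches what the evaluate library's mean_iou metric returns; a toy sketch under that assumption:

```python
# Toy sketch of the evaluate library's "mean_iou" metric (assumed, not
# confirmed by the card). Class ids: 0=background, 1=corm, 2=damage.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Tiny 3x3 prediction/label maps standing in for real segmentation masks.
pred = np.array([[0, 0, 1], [2, 1, 1], [0, 2, 2]])
label = np.array([[0, 0, 1], [2, 2, 1], [0, 2, 2]])

results = metric.compute(
    predictions=[pred],
    references=[label],
    num_labels=3,
    ignore_index=255,  # conventional "unlabeled" id; an assumption here
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```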

Framework versions

  • Transformers 4.44.1
  • PyTorch 2.6.0+cpu
  • Datasets 2.21.0
  • Tokenizers 0.19.1