# segformer-b0-finetuned-cityscapes-1024-1024_corm
This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b0-finetuned-cityscapes-1024-1024) on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):
- Loss: 0.0506
- Mean Iou: 0.9091
- Mean Accuracy: 0.9514
- Overall Accuracy: 0.9835
- Accuracy Background: 0.9977
- Accuracy Corm: 0.9256
- Accuracy Damage: 0.9308
- Iou Background: 0.9938
- Iou Corm: 0.8393
- Iou Damage: 0.8942
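The card does not include a usage snippet, so the following is a minimal inference sketch using the `transformers` SegFormer classes. The repository ID and image path are placeholders, and the three-class label set (background, corm, damage) is inferred from the per-class metrics above.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder Hub ID: replace with the actual repository ID or a local checkpoint path.
checkpoint = "your-username/segformer-b0-finetuned-cityscapes-1024-1024_corm"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("corm_sample.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the low-resolution logits to the input size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # per-pixel class indices (e.g. background / corm / damage)
```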
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
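For reference, here is a hedged sketch of how the hyperparameters above could map onto `transformers.TrainingArguments`. The output directory, the number of labels, and the evaluation schedule (every 20 steps, matching the results table below) are assumptions not stated explicitly in the card.

```python
from transformers import SegformerForSemanticSegmentation, TrainingArguments

# Base checkpoint named in the card; the Cityscapes head predicts 19 classes,
# so it is re-initialised for the 3 classes reported here (background, corm, damage).
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b0-finetuned-cityscapes-1024-1024",
    num_labels=3,
    ignore_mismatched_sizes=True,
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-cityscapes-1024-1024_corm",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=40,
    eval_strategy="steps",  # assumed: the results table logs evaluation every 20 steps
    eval_steps=20,
    logging_steps=20,
)

# A Trainer would additionally need train_dataset, eval_dataset and a compute_metrics
# function producing the mean IoU / accuracy columns below; the card does not describe them.
```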
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
---|---|---|---|---|---|---|---|---|---|---|---|---|
0.8589 | 0.6061 | 20 | 0.8411 | 0.4729 | 0.5967 | 0.8533 | 0.9785 | 0.5630 | 0.2486 | 0.9291 | 0.2623 | 0.2273 |
0.7143 | 1.2121 | 40 | 0.6737 | 0.6855 | 0.8154 | 0.9261 | 0.9743 | 0.7133 | 0.7586 | 0.9723 | 0.4438 | 0.6403 |
0.5785 | 1.8182 | 60 | 0.5154 | 0.7300 | 0.8494 | 0.9393 | 0.9707 | 0.6523 | 0.9253 | 0.9704 | 0.4854 | 0.7342 |
0.4449 | 2.4242 | 80 | 0.3898 | 0.8138 | 0.9106 | 0.9596 | 0.9778 | 0.8207 | 0.9333 | 0.9771 | 0.6594 | 0.8048 |
0.3313 | 3.0303 | 100 | 0.3061 | 0.8287 | 0.9246 | 0.9633 | 0.9820 | 0.9175 | 0.8743 | 0.9809 | 0.7007 | 0.8046 |
0.2799 | 3.6364 | 120 | 0.2367 | 0.8545 | 0.9230 | 0.9717 | 0.9907 | 0.8453 | 0.9331 | 0.9882 | 0.7423 | 0.8330 |
0.2482 | 4.2424 | 140 | 0.2088 | 0.8669 | 0.9301 | 0.9746 | 0.9927 | 0.8707 | 0.9269 | 0.9899 | 0.7653 | 0.8455 |
0.2134 | 4.8485 | 160 | 0.1841 | 0.8637 | 0.9343 | 0.9739 | 0.9939 | 0.9403 | 0.8687 | 0.9908 | 0.7666 | 0.8337 |
0.1695 | 5.4545 | 180 | 0.1585 | 0.8832 | 0.9405 | 0.9780 | 0.9943 | 0.9053 | 0.9221 | 0.9916 | 0.7954 | 0.8625 |
0.1581 | 6.0606 | 200 | 0.1414 | 0.8895 | 0.9410 | 0.9795 | 0.9959 | 0.8998 | 0.9273 | 0.9922 | 0.8051 | 0.8711 |
0.1413 | 6.6667 | 220 | 0.1253 | 0.8943 | 0.9430 | 0.9805 | 0.9966 | 0.9060 | 0.9263 | 0.9926 | 0.8141 | 0.8763 |
0.1125 | 7.2727 | 240 | 0.1138 | 0.8955 | 0.9453 | 0.9807 | 0.9965 | 0.9193 | 0.9203 | 0.9927 | 0.8169 | 0.8770 |
0.1195 | 7.8788 | 260 | 0.1124 | 0.8811 | 0.9411 | 0.9779 | 0.9967 | 0.9503 | 0.8763 | 0.9926 | 0.7970 | 0.8537 |
0.1032 | 8.4848 | 280 | 0.1049 | 0.8912 | 0.9457 | 0.9798 | 0.9964 | 0.9413 | 0.8994 | 0.9927 | 0.8117 | 0.8692 |
0.104 | 9.0909 | 300 | 0.0912 | 0.9013 | 0.9459 | 0.9819 | 0.9971 | 0.9070 | 0.9337 | 0.9929 | 0.8255 | 0.8856 |
0.0994 | 9.6970 | 320 | 0.0887 | 0.9022 | 0.9473 | 0.9820 | 0.9972 | 0.9172 | 0.9275 | 0.9930 | 0.8275 | 0.8860 |
0.0914 | 10.3030 | 340 | 0.0866 | 0.8999 | 0.9471 | 0.9815 | 0.9974 | 0.9278 | 0.9160 | 0.9930 | 0.8249 | 0.8819 |
0.0917 | 10.9091 | 360 | 0.0813 | 0.9032 | 0.9473 | 0.9822 | 0.9975 | 0.9166 | 0.9279 | 0.9930 | 0.8293 | 0.8873 |
0.0822 | 11.5152 | 380 | 0.0774 | 0.9038 | 0.9454 | 0.9825 | 0.9972 | 0.8900 | 0.9490 | 0.9932 | 0.8280 | 0.8903 |
0.078 | 12.1212 | 400 | 0.0766 | 0.9035 | 0.9488 | 0.9823 | 0.9973 | 0.9244 | 0.9247 | 0.9932 | 0.8301 | 0.8871 |
0.0782 | 12.7273 | 420 | 0.0739 | 0.9027 | 0.9490 | 0.9822 | 0.9974 | 0.9302 | 0.9195 | 0.9933 | 0.8293 | 0.8856 |
0.0759 | 13.3333 | 440 | 0.0715 | 0.9025 | 0.9487 | 0.9821 | 0.9975 | 0.9316 | 0.9170 | 0.9933 | 0.8292 | 0.8851 |
0.066 | 13.9394 | 460 | 0.0691 | 0.9059 | 0.9480 | 0.9828 | 0.9975 | 0.9089 | 0.9377 | 0.9934 | 0.8328 | 0.8915 |
0.0774 | 14.5455 | 480 | 0.0674 | 0.9059 | 0.9493 | 0.9827 | 0.9976 | 0.9237 | 0.9267 | 0.9933 | 0.8339 | 0.8904 |
0.0719 | 15.1515 | 500 | 0.0690 | 0.9034 | 0.9488 | 0.9823 | 0.9977 | 0.9305 | 0.9183 | 0.9933 | 0.8307 | 0.8862 |
0.0713 | 15.7576 | 520 | 0.0666 | 0.9003 | 0.9486 | 0.9817 | 0.9974 | 0.9368 | 0.9117 | 0.9935 | 0.8262 | 0.8814 |
0.0647 | 16.3636 | 540 | 0.0645 | 0.9033 | 0.9498 | 0.9823 | 0.9974 | 0.9346 | 0.9173 | 0.9934 | 0.8307 | 0.8858 |
0.0576 | 16.9697 | 560 | 0.0637 | 0.9046 | 0.9499 | 0.9826 | 0.9975 | 0.9301 | 0.9221 | 0.9936 | 0.8323 | 0.8879 |
0.0598 | 17.5758 | 580 | 0.0625 | 0.9044 | 0.9501 | 0.9825 | 0.9974 | 0.9333 | 0.9197 | 0.9935 | 0.8321 | 0.8875 |
0.0676 | 18.1818 | 600 | 0.0644 | 0.8991 | 0.9489 | 0.9815 | 0.9974 | 0.9447 | 0.9047 | 0.9936 | 0.8243 | 0.8794 |
0.0474 | 18.7879 | 620 | 0.0624 | 0.9018 | 0.9503 | 0.9820 | 0.9973 | 0.9427 | 0.9108 | 0.9936 | 0.8285 | 0.8832 |
0.0611 | 19.3939 | 640 | 0.0606 | 0.9064 | 0.9504 | 0.9829 | 0.9976 | 0.9282 | 0.9254 | 0.9936 | 0.8350 | 0.8905 |
0.058 | 20.0 | 660 | 0.0596 | 0.9048 | 0.9508 | 0.9826 | 0.9973 | 0.9355 | 0.9197 | 0.9936 | 0.8330 | 0.8877 |
0.0574 | 20.6061 | 680 | 0.0575 | 0.9082 | 0.9484 | 0.9834 | 0.9973 | 0.8972 | 0.9507 | 0.9938 | 0.8356 | 0.8951 |
0.0562 | 21.2121 | 700 | 0.0576 | 0.9065 | 0.9465 | 0.9831 | 0.9973 | 0.8870 | 0.9553 | 0.9937 | 0.8319 | 0.8939 |
0.0551 | 21.8182 | 720 | 0.0571 | 0.9067 | 0.9515 | 0.9830 | 0.9972 | 0.9299 | 0.9274 | 0.9938 | 0.8361 | 0.8903 |
0.0498 | 22.4242 | 740 | 0.0564 | 0.9090 | 0.9509 | 0.9834 | 0.9976 | 0.9217 | 0.9336 | 0.9936 | 0.8388 | 0.8945 |
0.0566 | 23.0303 | 760 | 0.0554 | 0.9067 | 0.9511 | 0.9830 | 0.9975 | 0.9307 | 0.9251 | 0.9938 | 0.8359 | 0.8904 |
0.0436 | 23.6364 | 780 | 0.0567 | 0.9056 | 0.9509 | 0.9827 | 0.9976 | 0.9362 | 0.9191 | 0.9936 | 0.8344 | 0.8889 |
0.0586 | 24.2424 | 800 | 0.0548 | 0.9081 | 0.9515 | 0.9832 | 0.9974 | 0.9273 | 0.9296 | 0.9938 | 0.8380 | 0.8924 |
0.0497 | 24.8485 | 820 | 0.0549 | 0.9091 | 0.9511 | 0.9834 | 0.9976 | 0.9227 | 0.9331 | 0.9937 | 0.8390 | 0.8946 |
0.0535 | 25.4545 | 840 | 0.0544 | 0.9073 | 0.9510 | 0.9831 | 0.9976 | 0.9296 | 0.9256 | 0.9937 | 0.8368 | 0.8913 |
0.0514 | 26.0606 | 860 | 0.0539 | 0.9096 | 0.9514 | 0.9836 | 0.9975 | 0.9205 | 0.9362 | 0.9938 | 0.8399 | 0.8953 |
0.0684 | 26.6667 | 880 | 0.0550 | 0.9055 | 0.9511 | 0.9827 | 0.9976 | 0.9383 | 0.9174 | 0.9937 | 0.8344 | 0.8884 |
0.0542 | 27.2727 | 900 | 0.0524 | 0.9100 | 0.9512 | 0.9836 | 0.9976 | 0.9196 | 0.9364 | 0.9938 | 0.8403 | 0.8959 |
0.0455 | 27.8788 | 920 | 0.0534 | 0.9083 | 0.9518 | 0.9833 | 0.9975 | 0.9296 | 0.9282 | 0.9938 | 0.8384 | 0.8929 |
0.0512 | 28.4848 | 940 | 0.0525 | 0.9095 | 0.9504 | 0.9836 | 0.9977 | 0.9149 | 0.9386 | 0.9938 | 0.8393 | 0.8954 |
0.0486 | 29.0909 | 960 | 0.0524 | 0.9083 | 0.9516 | 0.9833 | 0.9976 | 0.9292 | 0.9280 | 0.9938 | 0.8383 | 0.8927 |
0.0486 | 29.6970 | 980 | 0.0517 | 0.9099 | 0.9509 | 0.9836 | 0.9976 | 0.9172 | 0.9380 | 0.9938 | 0.8401 | 0.8958 |
0.0388 | 30.3030 | 1000 | 0.0514 | 0.9101 | 0.9510 | 0.9837 | 0.9975 | 0.9147 | 0.9408 | 0.9938 | 0.8402 | 0.8962 |
0.0571 | 30.9091 | 1020 | 0.0518 | 0.9091 | 0.9513 | 0.9834 | 0.9977 | 0.9247 | 0.9314 | 0.9938 | 0.8392 | 0.8942 |
0.0515 | 31.5152 | 1040 | 0.0511 | 0.9103 | 0.9510 | 0.9837 | 0.9975 | 0.9135 | 0.9418 | 0.9939 | 0.8404 | 0.8966 |
0.049 | 32.1212 | 1060 | 0.0517 | 0.9100 | 0.9510 | 0.9836 | 0.9976 | 0.9171 | 0.9385 | 0.9938 | 0.8402 | 0.8958 |
0.0533 | 32.7273 | 1080 | 0.0513 | 0.9095 | 0.9514 | 0.9835 | 0.9975 | 0.9221 | 0.9346 | 0.9938 | 0.8398 | 0.8949 |
0.0443 | 33.3333 | 1100 | 0.0513 | 0.9092 | 0.9513 | 0.9835 | 0.9977 | 0.9245 | 0.9317 | 0.9938 | 0.8395 | 0.8944 |
0.0573 | 33.9394 | 1120 | 0.0516 | 0.9089 | 0.9515 | 0.9834 | 0.9976 | 0.9270 | 0.9297 | 0.9938 | 0.8390 | 0.8938 |
0.0421 | 34.5455 | 1140 | 0.0516 | 0.9082 | 0.9512 | 0.9833 | 0.9977 | 0.9294 | 0.9264 | 0.9938 | 0.8382 | 0.8927 |
0.0509 | 35.1515 | 1160 | 0.0503 | 0.9102 | 0.9508 | 0.9837 | 0.9976 | 0.9145 | 0.9403 | 0.9938 | 0.8403 | 0.8966 |
0.0854 | 35.7576 | 1180 | 0.0511 | 0.9087 | 0.9518 | 0.9834 | 0.9975 | 0.9285 | 0.9293 | 0.9938 | 0.8388 | 0.8934 |
0.0522 | 36.3636 | 1200 | 0.0508 | 0.9089 | 0.9516 | 0.9834 | 0.9976 | 0.9269 | 0.9302 | 0.9938 | 0.8392 | 0.8938 |
0.0648 | 36.9697 | 1220 | 0.0503 | 0.9103 | 0.9514 | 0.9837 | 0.9975 | 0.9175 | 0.9391 | 0.9939 | 0.8408 | 0.8964 |
0.0513 | 37.5758 | 1240 | 0.0502 | 0.9099 | 0.9511 | 0.9836 | 0.9977 | 0.9203 | 0.9353 | 0.9938 | 0.8402 | 0.8957 |
0.0494 | 38.1818 | 1260 | 0.0512 | 0.9093 | 0.9516 | 0.9835 | 0.9976 | 0.9257 | 0.9316 | 0.9938 | 0.8396 | 0.8944 |
0.0513 | 38.7879 | 1280 | 0.0510 | 0.9096 | 0.9517 | 0.9836 | 0.9975 | 0.9232 | 0.9343 | 0.9939 | 0.8400 | 0.8949 |
0.0573 | 39.3939 | 1300 | 0.0508 | 0.9092 | 0.9514 | 0.9835 | 0.9976 | 0.9249 | 0.9318 | 0.9938 | 0.8395 | 0.8943 |
0.0627 | 40.0 | 1320 | 0.0506 | 0.9091 | 0.9514 | 0.9835 | 0.9977 | 0.9256 | 0.9308 | 0.9938 | 0.8393 | 0.8942 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1