
segformer-b1-finetuned-cityscapes-1024-1024-latestt

This model is a fine-tuned version of nvidia/segformer-b1-finetuned-cityscapes-1024-1024 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0311
  • Mean Iou: 0.9368
  • Mean Accuracy: 0.9605
  • Overall Accuracy: 0.9893
  • Accuracy Default: 1e-06
  • Accuracy Pipe: 0.8961
  • Accuracy Floor: 0.9893
  • Accuracy Background: 0.9960
  • Iou Default: 1e-06
  • Iou Pipe: 0.8401
  • Iou Floor: 0.9817
  • Iou Background: 0.9886
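
For reference, here is a minimal inference sketch using the transformers library. The checkpoint id matches this repository; the input image path is a placeholder, and the label names should be taken from the model's id2label config rather than inferred from this card.

```python
# Minimal semantic-segmentation inference sketch (assumes transformers + PyTorch are installed).
# The image path "example.jpg" is a placeholder.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "selvaa/segformer-b1-finetuned-cityscapes-1024-1024-latestt"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
print(pred_mask.shape, model.config.id2label)
```

The predicted mask indexes into model.config.id2label, which defines the class set this checkpoint was fine-tuned on.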

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 3
  • eval_batch_size: 3
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
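
As a rough sketch only, these values map onto transformers TrainingArguments roughly as follows; the output directory and the evaluation strategy are assumptions not stated in this card.

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments.
# Only the values listed above come from the card; everything else is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    evaluation_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
)
```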

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Default | Accuracy Pipe | Accuracy Floor | Accuracy Background | Iou Default | Iou Pipe | Iou Floor | Iou Background |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:-------------:|:--------------:|:-------------------:|:-----------:|:--------:|:---------:|:--------------:|
| 0.5401 | 1.0 | 36 | 0.2151 | 0.7616 | 0.8215 | 0.9549 | 1e-06 | 0.5060 | 0.9791 | 0.9793 | 1e-06 | 0.4188 | 0.9090 | 0.9571 |
| 0.1576 | 2.0 | 72 | 0.1166 | 0.8481 | 0.8906 | 0.9737 | 1e-06 | 0.7051 | 0.9739 | 0.9929 | 1e-06 | 0.6121 | 0.9598 | 0.9724 |
| 0.0941 | 3.0 | 108 | 0.0739 | 0.8869 | 0.9301 | 0.9802 | 1e-06 | 0.8180 | 0.9807 | 0.9916 | 1e-06 | 0.7164 | 0.9646 | 0.9798 |
| 0.0678 | 4.0 | 144 | 0.0600 | 0.9001 | 0.9439 | 0.9824 | 1e-06 | 0.8588 | 0.9814 | 0.9916 | 1e-06 | 0.7499 | 0.9685 | 0.9820 |
| 0.0566 | 5.0 | 180 | 0.0493 | 0.9087 | 0.9531 | 0.9841 | 1e-06 | 0.8887 | 0.9778 | 0.9929 | 1e-06 | 0.7711 | 0.9711 | 0.9840 |
| 0.0463 | 6.0 | 216 | 0.0454 | 0.9115 | 0.9455 | 0.9847 | 1e-06 | 0.8608 | 0.9809 | 0.9949 | 1e-06 | 0.7769 | 0.9735 | 0.9840 |
| 0.0412 | 7.0 | 252 | 0.0425 | 0.9138 | 0.9469 | 0.9854 | 1e-06 | 0.8611 | 0.9851 | 0.9945 | 1e-06 | 0.7823 | 0.9739 | 0.9852 |
| 0.038 | 8.0 | 288 | 0.0379 | 0.9226 | 0.9550 | 0.9868 | 1e-06 | 0.8864 | 0.9836 | 0.9951 | 1e-06 | 0.8049 | 0.9766 | 0.9864 |
| 0.0337 | 9.0 | 324 | 0.0392 | 0.9184 | 0.9468 | 0.9862 | 1e-06 | 0.8626 | 0.9812 | 0.9966 | 1e-06 | 0.7935 | 0.9765 | 0.9853 |
| 0.0323 | 10.0 | 360 | 0.0350 | 0.9246 | 0.9544 | 0.9872 | 1e-06 | 0.8827 | 0.9852 | 0.9953 | 1e-06 | 0.8095 | 0.9779 | 0.9864 |
| 0.0294 | 11.0 | 396 | 0.0350 | 0.9253 | 0.9523 | 0.9873 | 1e-06 | 0.8725 | 0.9898 | 0.9947 | 1e-06 | 0.8114 | 0.9783 | 0.9863 |
| 0.0275 | 12.0 | 432 | 0.0379 | 0.9185 | 0.9461 | 0.9862 | 1e-06 | 0.8606 | 0.9810 | 0.9968 | 1e-06 | 0.7950 | 0.9748 | 0.9857 |
| 0.0279 | 13.0 | 468 | 0.0333 | 0.9267 | 0.9572 | 0.9875 | 1e-06 | 0.8914 | 0.9853 | 0.9951 | 1e-06 | 0.8160 | 0.9770 | 0.9871 |
| 0.0269 | 14.0 | 504 | 0.0323 | 0.9267 | 0.9495 | 0.9878 | 1e-06 | 0.8640 | 0.9878 | 0.9967 | 1e-06 | 0.8141 | 0.9790 | 0.9872 |
| 0.0239 | 15.0 | 540 | 0.0300 | 0.9324 | 0.9570 | 0.9886 | 1e-06 | 0.8864 | 0.9887 | 0.9959 | 1e-06 | 0.8290 | 0.9802 | 0.9880 |
| 0.0229 | 16.0 | 576 | 0.0303 | 0.9343 | 0.9610 | 0.9888 | 1e-06 | 0.9005 | 0.9867 | 0.9959 | 1e-06 | 0.8344 | 0.9801 | 0.9882 |
| 0.0217 | 17.0 | 612 | 0.0318 | 0.9290 | 0.9531 | 0.9882 | 1e-06 | 0.8743 | 0.9889 | 0.9962 | 1e-06 | 0.8197 | 0.9797 | 0.9877 |
| 0.021 | 18.0 | 648 | 0.0305 | 0.9314 | 0.9540 | 0.9886 | 1e-06 | 0.8756 | 0.9904 | 0.9961 | 1e-06 | 0.8256 | 0.9806 | 0.9879 |
| 0.0209 | 19.0 | 684 | 0.0296 | 0.9344 | 0.9637 | 0.9887 | 1e-06 | 0.9059 | 0.9915 | 0.9937 | 1e-06 | 0.8357 | 0.9795 | 0.9880 |
| 0.0199 | 20.0 | 720 | 0.0306 | 0.9335 | 0.9585 | 0.9888 | 1e-06 | 0.8896 | 0.9902 | 0.9955 | 1e-06 | 0.8319 | 0.9802 | 0.9883 |
| 0.0187 | 21.0 | 756 | 0.0305 | 0.9331 | 0.9576 | 0.9887 | 1e-06 | 0.8877 | 0.9891 | 0.9959 | 1e-06 | 0.8308 | 0.9803 | 0.9881 |
| 0.0184 | 22.0 | 792 | 0.0298 | 0.9353 | 0.9594 | 0.9891 | 1e-06 | 0.8926 | 0.9896 | 0.9959 | 1e-06 | 0.8364 | 0.9810 | 0.9884 |
| 0.0175 | 23.0 | 828 | 0.0301 | 0.9340 | 0.9576 | 0.9889 | 1e-06 | 0.8866 | 0.9906 | 0.9957 | 1e-06 | 0.8332 | 0.9806 | 0.9882 |
| 0.0174 | 24.0 | 864 | 0.0288 | 0.9360 | 0.9633 | 0.9891 | 1e-06 | 0.9062 | 0.9883 | 0.9953 | 1e-06 | 0.8389 | 0.9805 | 0.9885 |
| 0.0178 | 25.0 | 900 | 0.0306 | 0.9343 | 0.9603 | 0.9889 | 1e-06 | 0.8956 | 0.9902 | 0.9951 | 1e-06 | 0.8342 | 0.9804 | 0.9882 |
| 0.0165 | 26.0 | 936 | 0.0304 | 0.9349 | 0.9579 | 0.9891 | 1e-06 | 0.8884 | 0.9889 | 0.9963 | 1e-06 | 0.8353 | 0.9810 | 0.9883 |
| 0.016 | 27.0 | 972 | 0.0300 | 0.9352 | 0.9597 | 0.9891 | 1e-06 | 0.8934 | 0.9902 | 0.9956 | 1e-06 | 0.8362 | 0.9810 | 0.9884 |
| 0.0159 | 28.0 | 1008 | 0.0311 | 0.9343 | 0.9575 | 0.9890 | 1e-06 | 0.8872 | 0.9891 | 0.9962 | 1e-06 | 0.8340 | 0.9804 | 0.9884 |
| 0.0157 | 29.0 | 1044 | 0.0302 | 0.9362 | 0.9631 | 0.9891 | 1e-06 | 0.9050 | 0.9894 | 0.9951 | 1e-06 | 0.8389 | 0.9813 | 0.9883 |
| 0.015 | 30.0 | 1080 | 0.0312 | 0.9344 | 0.9601 | 0.9890 | 1e-06 | 0.8959 | 0.9887 | 0.9957 | 1e-06 | 0.8340 | 0.9809 | 0.9883 |
| 0.0162 | 31.0 | 1116 | 0.0334 | 0.9321 | 0.9558 | 0.9886 | 1e-06 | 0.8833 | 0.9876 | 0.9965 | 1e-06 | 0.8276 | 0.9807 | 0.9879 |
| 0.0144 | 32.0 | 1152 | 0.0312 | 0.9352 | 0.9610 | 0.9890 | 1e-06 | 0.8976 | 0.9900 | 0.9952 | 1e-06 | 0.8366 | 0.9805 | 0.9883 |
| 0.0147 | 33.0 | 1188 | 0.0299 | 0.9375 | 0.9607 | 0.9895 | 1e-06 | 0.8960 | 0.9902 | 0.9959 | 1e-06 | 0.8419 | 0.9817 | 0.9888 |
| 0.0144 | 34.0 | 1224 | 0.0323 | 0.9342 | 0.9592 | 0.9889 | 1e-06 | 0.8937 | 0.9879 | 0.9961 | 1e-06 | 0.8341 | 0.9802 | 0.9884 |
| 0.0144 | 35.0 | 1260 | 0.0303 | 0.9359 | 0.9608 | 0.9892 | 1e-06 | 0.8977 | 0.9890 | 0.9959 | 1e-06 | 0.8378 | 0.9812 | 0.9886 |
| 0.014 | 36.0 | 1296 | 0.0314 | 0.9359 | 0.9577 | 0.9892 | 1e-06 | 0.8878 | 0.9886 | 0.9967 | 1e-06 | 0.8378 | 0.9814 | 0.9885 |
| 0.0136 | 37.0 | 1332 | 0.0316 | 0.9365 | 0.9600 | 0.9893 | 1e-06 | 0.8954 | 0.9883 | 0.9963 | 1e-06 | 0.8397 | 0.9813 | 0.9885 |
| 0.0138 | 38.0 | 1368 | 0.0325 | 0.9352 | 0.9577 | 0.9891 | 1e-06 | 0.8869 | 0.9899 | 0.9962 | 1e-06 | 0.8361 | 0.9812 | 0.9884 |
| 0.0137 | 39.0 | 1404 | 0.0316 | 0.9363 | 0.9597 | 0.9893 | 1e-06 | 0.8933 | 0.9896 | 0.9960 | 1e-06 | 0.8391 | 0.9811 | 0.9886 |
| 0.0132 | 40.0 | 1440 | 0.0320 | 0.9353 | 0.9590 | 0.9891 | 1e-06 | 0.8930 | 0.9876 | 0.9965 | 1e-06 | 0.8368 | 0.9807 | 0.9884 |
| 0.0129 | 41.0 | 1476 | 0.0311 | 0.9368 | 0.9605 | 0.9893 | 1e-06 | 0.8961 | 0.9893 | 0.9960 | 1e-06 | 0.8401 | 0.9817 | 0.9886 |
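
The per-class accuracy and IoU columns above are the kind of output produced by a mean-IoU segmentation metric over the four reported classes (default, pipe, floor, background). Below is a hedged sketch using the evaluate library's mean_iou metric with placeholder data and an assumed label mapping; the real label ids and ignore index live in the model and dataset configuration.

```python
# Sketch of computing mean IoU / per-class IoU with the `evaluate` library.
# The label count and ignore_index below are assumptions for illustration only.
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# predictions and references: lists of (H, W) integer label maps
predictions = [np.random.randint(0, 4, (64, 64))]  # placeholder prediction
references = [np.random.randint(0, 4, (64, 64))]   # placeholder ground truth

results = mean_iou.compute(
    predictions=predictions,
    references=references,
    num_labels=4,      # default, pipe, floor, background (inferred from the metrics above)
    ignore_index=255,  # assumed ignore value
)
print(results["mean_iou"], results["per_category_iou"])
```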

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.0.1
  • Datasets 2.15.0
  • Tokenizers 0.15.0
Model size

  • 13.7M parameters (Safetensors, F32)