
segformer-b0-scene-parse-nb-test

This model is a fine-tuned version of nvidia/mit-b0 on the scene_parse_150 dataset. It achieves the following results on the evaluation set:

  • Loss: 3.2785
  • Mean Iou: 0.0483
  • Mean Accuracy: 0.1013
  • Overall Accuracy: 0.4880
  • Per Category Iou: [0.46276462034550236, 0.532477560882453, 0.6744570143774657, 0.19919054911397943, 0.46464817442538875, 0.05646932895219959, 0.48488421437810175, 0.0, 0.0, 0.0, 0.0, 0.002821795941061996, 0.0, 0.29877168545017097, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.10829741379310345, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0012044843880292782, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan]
  • Per Category Accuracy: [0.8456098091660934, 0.8940305027091834, 0.9956332917456356, 0.28567693353553314, 0.8669273825602569, 0.05881819124662933, 0.9942061606435747, 0.0, 0.0, 0.0, 0.0, 0.0030775761434787222, 0.0, 0.3047323012278496, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.7257955314827352, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0012044843880292782, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan]
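The `nan` entries mark categories that never appear in the evaluation set (neither in predictions nor in ground truth); such categories are excluded when averaging. A minimal sketch of how Mean IoU, Mean Accuracy, and Overall Accuracy relate to a per-category confusion matrix (illustrative only, following the usual semantic-segmentation convention rather than the exact evaluation code used here):

```python
import math

# Toy 3-class confusion matrix (rows = ground truth, cols = prediction).
# Class 2 never occurs, so its IoU/accuracy come out as nan and are skipped.
cm = [
    [3, 1, 0],
    [0, 5, 0],
    [0, 0, 0],
]
n = len(cm)
tp = [cm[i][i] for i in range(n)]
gt = [sum(cm[i]) for i in range(n)]                         # pixels per class in ground truth
pred = [sum(cm[i][j] for i in range(n)) for j in range(n)]  # pixels predicted per class

# Per-category IoU = TP / (TP + FP + FN); nan when the class is entirely absent.
iou = [tp[i] / d if (d := gt[i] + pred[i] - tp[i]) else math.nan for i in range(n)]
acc = [tp[i] / gt[i] if gt[i] else math.nan for i in range(n)]

overall_accuracy = sum(tp) / sum(gt)                     # all correct pixels / all pixels
mean_accuracy = sum(a for a in acc if not math.isnan(a)) / sum(
    not math.isnan(a) for a in acc)                      # average over present classes
mean_iou = sum(v for v in iou if not math.isnan(v)) / sum(
    not math.isnan(v) for v in iou)                      # average over present classes
```

With many rare classes scoring 0.0 (as in the arrays above), the mean over all 150 categories is pulled far below the overall pixel accuracy, which explains the gap between Mean IoU (0.0483) and Overall Accuracy (0.4880).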

Model description

More information needed

Intended uses & limitations

More information needed
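Pending a fuller description, a minimal inference sketch may help. The class names below are the standard SegFormer API in `transformers`; treat the details as assumptions rather than the author's own usage. The key point is that SegFormer emits logits at 1/4 of the input resolution, so they must be upsampled before taking the per-pixel argmax:

```python
import torch

# Loading the checkpoint (commented out so the sketch stays self-contained):
#
#   from transformers import AutoImageProcessor, SegformerForSemanticSegmentation
#   repo = "rnibhanu/segformer-b0-scene-parse-nb-test"
#   processor = AutoImageProcessor.from_pretrained(repo)
#   model = SegformerForSemanticSegmentation.from_pretrained(repo)
#   logits = model(**processor(images=image, return_tensors="pt")).logits

# Stand-in for a (batch, num_labels, H/4, W/4) model output on a 512x512 image:
logits = torch.randn(1, 150, 128, 128)

# Upsample to the original image size, then argmax over the label dimension:
upsampled = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)  # (1, 512, 512) map of class ids
```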

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
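These settings map onto `transformers.TrainingArguments` roughly as follows. This is a hypothetical reconstruction for reference: the keyword names follow the HF Trainer API, and the original training script may have differed.

```python
# Hypothetical mapping of the reported hyperparameters onto TrainingArguments
# keyword names (illustrative only; not the original training script).
training_kwargs = dict(
    learning_rate=6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,            # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,         # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
# args = transformers.TrainingArguments(output_dir="out", **training_kwargs)
```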

Training results

Training Loss   Epoch   Step   Validation Loss   Mean Iou   Mean Accuracy   Overall Accuracy
4.4391          6.67    20     4.8268            0.0166     0.0702          0.2862
3.9332          13.33   40     4.2239            0.0222     0.0817          0.3217
3.7699          20.0    60     3.7570            0.0299     0.0895          0.3923
3.6842          26.67   80     3.4760            0.0401     0.0939          0.4455
3.5807          33.33   100    3.4120            0.0465     0.0982          0.4779
3.4999          40.0    120    3.2752            0.0445     0.0983          0.4740
3.3419          46.67   140    3.2785            0.0483     0.1013          0.4880

(Per-category IoU and accuracy arrays for each checkpoint are omitted from the table for readability; the values for the final checkpoint are listed in the evaluation summary above.)

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1
  • Datasets 2.14.3
  • Tokenizers 0.13.3