---
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-agriculture-freeze-encoder
  results: []
---

# segformer-b0-finetuned-agriculture-freeze-encoder

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an agricultural semantic-segmentation dataset (the dataset name was not recorded).
It achieves the following results on the evaluation set:
- Loss: 0.4279
- Mean Iou: 0.2863
- Mean Accuracy: 0.3449
- Overall Accuracy: 0.3991
- Accuracy Unlabeled: nan
- Accuracy Nutrient Deficiency: 0.3911
- Accuracy Planter Skip: 0.2441
- Accuracy Water: 0.7100
- Accuracy Waterway: 0.1217
- Accuracy Weed Cluster: 0.2574
- Iou Unlabeled: 0.0
- Iou Nutrient Deficiency: 0.3885
- Iou Planter Skip: 0.2436
- Iou Water: 0.7074
- Iou Waterway: 0.1213
- Iou Weed Cluster: 0.2569

## Model description

SegFormer pairs a hierarchical Transformer encoder (MiT, "Mix Transformer") with a lightweight all-MLP decode head for semantic segmentation. This checkpoint starts from the smallest encoder variant, MiT-b0, and, as the model name indicates, the encoder was kept frozen during fine-tuning, so only the decode head was adapted to the agricultural label set above.
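
A small sketch of the resulting frozen/trainable split, assuming the standard `transformers` SegFormer layout in which `model.segformer` is the encoder and `model.decode_head` is the MLP head:

```python
from transformers import SegformerForSemanticSegmentation

# Load the base encoder with a fresh 6-class head, mirroring this model's setup.
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=6)

encoder_params = sum(p.numel() for p in model.segformer.parameters())
head_params = sum(p.numel() for p in model.decode_head.parameters())
print(f"encoder (MiT-b0): {encoder_params / 1e6:.2f}M parameters (frozen here)")
print(f"decode head:      {head_params / 1e6:.2f}M parameters (trained)")
```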

## Intended uses & limitations

This model is intended for semantic segmentation of agricultural field imagery into five anomaly classes (nutrient deficiency, planter skip, water, waterway, weed cluster) plus an unlabeled background class. Given the final mean IoU of 0.2863, with waterway in particular at 0.1213 IoU and an unlabeled-class IoU of 0.0, it is best treated as a frozen-encoder baseline rather than a production-ready segmenter.
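
A minimal inference sketch follows. The repo id and input filename are placeholders; if the image processor was not pushed alongside the model, load it from `nvidia/mit-b0` instead.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id; substitute the actual location of this checkpoint.
repo_id = "your-username/segformer-b0-finetuned-agriculture-freeze-encoder"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("field_tile.png").convert("RGB")  # placeholder input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```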

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
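
A hedged reconstruction of this setup from the hyperparameters above; dataset preparation is elided, and the label mapping is an assumption inferred from the classes reported in the metrics:

```python
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

# Assumed label mapping, inferred from the per-class metrics in this card.
id2label = {
    0: "unlabeled",
    1: "nutrient_deficiency",
    2: "planter_skip",
    3: "water",
    4: "waterway",
    5: "weed_cluster",
}

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},
)

# "freeze-encoder": keep the MiT-b0 backbone fixed, train only the decode head.
for param in model.segformer.parameters():
    param.requires_grad = False

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-agriculture-freeze-encoder",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)

# Dataset wiring is elided; train_dataset / eval_dataset would be datasets
# yielding pixel_values and labels:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```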

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Nutrient Deficiency | Accuracy Planter Skip | Accuracy Water | Accuracy Waterway | Accuracy Weed Cluster | Iou Unlabeled | Iou Nutrient Deficiency | Iou Planter Skip | Iou Water | Iou Waterway | Iou Weed Cluster |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------------------:|:---------------------:|:--------------:|:-----------------:|:---------------------:|:-------------:|:-----------------------:|:----------------:|:---------:|:------------:|:----------------:|
| 0.2023        | 1.0   | 8145   | 0.5192          | 0.1103   | 0.1327        | 0.1902           | nan                | 0.2070                       | 0.0003                | 0.3457         | 0.0000            | 0.1105                | 0.0           | 0.2057                  | 0.0003           | 0.3455    | 0.0000       | 0.1103           |
| 0.7172        | 2.0   | 16290  | 0.4974          | 0.1282   | 0.1543        | 0.2138           | nan                | 0.2617                       | 0.0332                | 0.3582         | 0.0098            | 0.1083                | 0.0           | 0.2601                  | 0.0332           | 0.3582    | 0.0098       | 0.1082           |
| 0.6844        | 3.0   | 24435  | 0.4657          | 0.2032   | 0.2445        | 0.3092           | nan                | 0.3512                       | 0.1223                | 0.5564         | 0.0384            | 0.1544                | 0.0           | 0.3492                  | 0.1220           | 0.5554    | 0.0382       | 0.1543           |
| 0.2052        | 4.0   | 32580  | 0.4671          | 0.1912   | 0.2299        | 0.2961           | nan                | 0.3261                       | 0.1340                | 0.4389         | 0.0347            | 0.2160                | 0.0           | 0.3245                  | 0.1338           | 0.4389    | 0.0346       | 0.2154           |
| 0.6564        | 5.0   | 40725  | 0.4468          | 0.2317   | 0.2788        | 0.3460           | nan                | 0.3663                       | 0.1487                | 0.5721         | 0.0793            | 0.2278                | 0.0           | 0.3642                  | 0.1485           | 0.5715    | 0.0790       | 0.2272           |
| 0.1997        | 6.0   | 48870  | 0.4483          | 0.2392   | 0.2879        | 0.3446           | nan                | 0.3524                       | 0.1821                | 0.6219         | 0.0772            | 0.2059                | 0.0           | 0.3501                  | 0.1817           | 0.6209    | 0.0769       | 0.2055           |
| 0.3586        | 7.0   | 57015  | 0.4413          | 0.2379   | 0.2860        | 0.3492           | nan                | 0.3676                       | 0.2073                | 0.5656         | 0.0505            | 0.2392                | 0.0           | 0.3661                  | 0.2069           | 0.5653    | 0.0504       | 0.2387           |
| 0.7879        | 8.0   | 65160  | 0.4369          | 0.2501   | 0.3008        | 0.3597           | nan                | 0.3632                       | 0.2320                | 0.6003         | 0.0585            | 0.2497                | 0.0           | 0.3618                  | 0.2315           | 0.6000    | 0.0584       | 0.2491           |
| 1.137         | 9.0   | 73305  | 0.4393          | 0.2649   | 0.3189        | 0.3853           | nan                | 0.4236                       | 0.2202                | 0.6417         | 0.0772            | 0.2316                | 0.0           | 0.4212                  | 0.2197           | 0.6405    | 0.0770       | 0.2312           |
| 0.2625        | 10.0  | 81450  | 0.4388          | 0.2540   | 0.3057        | 0.3850           | nan                | 0.4493                       | 0.2058                | 0.5276         | 0.0705            | 0.2755                | 0.0           | 0.4460                  | 0.2054           | 0.5275    | 0.0704       | 0.2748           |
| 0.2108        | 11.0  | 89595  | 0.4308          | 0.2845   | 0.3427        | 0.4149           | nan                | 0.4475                       | 0.2446                | 0.6482         | 0.0891            | 0.2838                | 0.0           | 0.4438                  | 0.2440           | 0.6472    | 0.0889       | 0.2830           |
| 0.3237        | 12.0  | 97740  | 0.4251          | 0.2858   | 0.3440        | 0.4225           | nan                | 0.4322                       | 0.2372                | 0.6314         | 0.0854            | 0.3336                | 0.0           | 0.4296                  | 0.2367           | 0.6309    | 0.0853       | 0.3322           |
| 1.0289        | 13.0  | 105885 | 0.4488          | 0.2604   | 0.3138        | 0.3823           | nan                | 0.4623                       | 0.2111                | 0.6343         | 0.0744            | 0.1869                | 0.0           | 0.4575                  | 0.2107           | 0.6332    | 0.0742       | 0.1867           |
| 0.6843        | 14.0  | 114030 | 0.4253          | 0.2922   | 0.3519        | 0.4252           | nan                | 0.4515                       | 0.2267                | 0.6901         | 0.1089            | 0.2824                | 0.0           | 0.4481                  | 0.2263           | 0.6884    | 0.1086       | 0.2815           |
| 0.2695        | 15.0  | 122175 | 0.4299          | 0.2856   | 0.3437        | 0.3878           | nan                | 0.3812                       | 0.2818                | 0.6873         | 0.1207            | 0.2472                | 0.0           | 0.3801                  | 0.2811           | 0.6858    | 0.1203       | 0.2466           |
| 0.3991        | 16.0  | 130320 | 0.4225          | 0.2938   | 0.3534        | 0.4137           | nan                | 0.4213                       | 0.2714                | 0.6712         | 0.1131            | 0.2898                | 0.0           | 0.4198                  | 0.2708           | 0.6702    | 0.1129       | 0.2888           |
| 0.7352        | 17.0  | 138465 | 0.4303          | 0.2732   | 0.3288        | 0.3894           | nan                | 0.4176                       | 0.2558                | 0.5941         | 0.1024            | 0.2740                | 0.0           | 0.4150                  | 0.2553           | 0.5939    | 0.1022       | 0.2731           |
| 0.6884        | 18.0  | 146610 | 0.4243          | 0.2956   | 0.3556        | 0.4135           | nan                | 0.4154                       | 0.2735                | 0.6575         | 0.1294            | 0.3024                | 0.0           | 0.4137                  | 0.2728           | 0.6568    | 0.1290       | 0.3013           |
| 0.3863        | 19.0  | 154755 | 0.4249          | 0.2861   | 0.3445        | 0.4184           | nan                | 0.4597                       | 0.2254                | 0.6370         | 0.1138            | 0.2864                | 0.0           | 0.4561                  | 0.2251           | 0.6365    | 0.1135       | 0.2858           |
| 0.3208        | 20.0  | 162900 | 0.4279          | 0.2863   | 0.3449        | 0.3991           | nan                | 0.3911                       | 0.2441                | 0.7100         | 0.1217            | 0.2574                | 0.0           | 0.3885                  | 0.2436           | 0.7074    | 0.1213       | 0.2569           |
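
The columns above match the output of the `evaluate` library's `mean_iou` metric (mean/overall accuracy plus per-category IoU and accuracy). A toy sketch of that computation, with random arrays standing in for real predictions:

```python
import numpy as np
import evaluate  # pip install evaluate

# Toy prediction / ground-truth maps standing in for real model output.
pred = np.random.randint(0, 6, size=(2, 64, 64))
label = np.random.randint(0, 6, size=(2, 64, 64))

mean_iou = evaluate.load("mean_iou")
results = mean_iou.compute(
    predictions=pred,
    references=label,
    num_labels=6,
    ignore_index=255,  # assumed ignore value for unannotated pixels
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```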


### Framework versions

- Transformers 4.39.1
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2