Update README.md
README.md CHANGED
|
#### Metrics

With the evaluation protocol, the **FLAIR-INC_RVBIE_resnet34_unet_15cl_norm** model has been evaluated at **OA = 76.37%** and **mIoU = 54.71%**.
The _snow_ class is discarded from the average metrics.

The following table gives the class-wise metrics:

| Class             | IoU (%)   | Fscore (%) | Precision (%) | Recall (%) |
| ----------------- | --------- | ---------- | ------------- | ---------- |
| …                 | …         | …          | …             | …          |
| agricultural land | 52.01     | 68.43      | 59.18         | 81.12      |
| plowed land       | 40.84     | 57.99      | 68.28         | 50.40      |
| swimming_pool     | 48.44     | 65.27      | 81.62         | 54.37      |
| _snow_            | _0.00_    | _0.00_     | _0.00_        | _0.00_     |
| greenhouse        | 39.45     | 56.57      | 45.52         | 74.72      |
| **average**       | **58.63** | **72.44**  | **74.30**     | **72.49**  |

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

{{ testing_metrics | default("[More Information Needed]", true)}}
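The per-class scores in the table can be derived from a confusion matrix. A minimal NumPy sketch, assuming the common convention that rows are ground truth and columns are predictions; the `per_class_metrics` helper and the toy 3-class matrix are illustrative, not the card's actual evaluation code:

```python
import numpy as np

def per_class_metrics(cm: np.ndarray):
    """Per-class IoU, F-score, precision and recall from a confusion
    matrix ``cm`` (rows: ground truth, columns: predictions)."""
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp          # predicted as class c, but wrong
    fn = cm.sum(axis=1) - tp          # true class c, but missed
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    iou = tp / (tp + fp + fn)
    fscore = 2 * precision * recall / (precision + recall)
    return iou, fscore, precision, recall

# Toy 3-class matrix (illustrative numbers, not the model's):
cm = np.array([[50, 2, 3],
               [4, 40, 6],
               [5, 1, 30]])
iou, fscore, precision, recall = per_class_metrics(cm)

# Overall accuracy, and a mean IoU that discards one class from the
# average (as _snow_ is discarded in the table above):
oa = np.diag(cm).sum() / cm.sum()
keep = np.ones(cm.shape[0], dtype=bool)
keep[2] = False                        # e.g. drop the last class
miou = iou[keep].mean()
```

Macro-averaging over the kept classes, as sketched here, matches how the table's **average** row is described (snow excluded); whether the card also weights classes is not stated.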
The following illustrations give the confusion matrices:

* Left: normalised according to columns (each column sums to 100%), so the **precision** is on the diagonal of the matrix
* Right: normalised according to rows (each row sums to 100%), so the **recall** is on the diagonal of the matrix

| <div style="width:290px">Normalised Confusion Matrix (precision)</div> | <div style="width:290px">Normalised Confusion Matrix (recall)</div> |
| --------------------------------------- | ------------------------------------- |
| <img src="FLAIR-INC_RVBIE_resnet34_unet_15cl_norm_cm-precision.png" alt="drawing" style="width:300px;"/> | <img src="FLAIR-INC_RVBIE_resnet34_unet_15cl_norm_cm-recall.png" alt="drawing" style="width:300px;"/> |
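The two normalisations can be sketched as follows, using an illustrative toy matrix with rows as ground truth and columns as predictions (this is not the card's plotting code):

```python
import numpy as np

# Toy confusion matrix (rows: ground truth, columns: predictions).
cm = np.array([[50, 2, 3],
               [4, 40, 6],
               [5, 1, 30]], dtype=float)

# Column-normalised: each column sums to 100%, so the diagonal
# holds the per-class precision (left-hand matrix).
cm_precision = 100 * cm / cm.sum(axis=0, keepdims=True)

# Row-normalised: each row sums to 100%, so the diagonal holds
# the per-class recall (right-hand matrix).
cm_recall = 100 * cm / cm.sum(axis=1, keepdims=True)
```

Off-diagonal cells of the row-normalised matrix show where a true class leaks to at prediction time; off-diagonals of the column-normalised one show what a predicted class was confused with.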