Noisy-Labels-Instance-Segmentation
ReadMe:
Important! The original annotations must be in COCO format.
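For reference, a COCO-style annotation file is a JSON object with images, annotations, and categories lists. The sketch below (placeholder values only) shows the fields that matter for instance segmentation:

```python
# Minimal sketch of a COCO-style annotation file (placeholder values).
import json

coco_style = {
    "images": [
        {"id": 1, "file_name": "000000000001.jpg", "height": 480, "width": 640}
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            # Polygon given as a flat list of x, y coordinates.
            "segmentation": [[120.0, 200.0, 180.0, 200.0, 180.0, 260.0, 120.0, 260.0]],
            "bbox": [120.0, 200.0, 60.0, 60.0],  # [x, y, width, height]
            "area": 3600.0,
            "iscrowd": 0,
        }
    ],
    "categories": [{"id": 1, "name": "person"}],
}

with open("annotations.json", "w") as f:
    json.dump(coco_style, f)
```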
To run a predefined benchmark, run the following:
python noise_annotations.py /path/to/annotations --benchmark {easy,medium,hard} --seed 1
where --benchmark selects the benchmark level (easy, medium, or hard).
For example:
python noise_annotations.py /path/to/annotations --benchmark easy --seed 1
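Once the script has produced a noised annotation file, you can sanity-check it with pycocotools. The file name below is a placeholder, not the script's actual output path; use whatever file your run writes:

```python
# Sketch: inspect a noised COCO-style annotation file with pycocotools.
# "noisy_annotations.json" is a placeholder name for the script's output.
from pycocotools.coco import COCO

coco = COCO("noisy_annotations.json")
print(f"{len(coco.getImgIds())} images, {len(coco.getAnnIds())} annotations")

# Look at one noised instance.
ann = coco.loadAnns(coco.getAnnIds())[0]
print(ann["category_id"], ann["bbox"])
```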
To run a custom noise method, run the following:
python noise_annotations.py /path/to/annotations --method_name method_name --corruption_values [{'rand': [scale_proportion, kernel_size], 'localization': [scale_proportion, std_dev], 'approximation': [scale_proportion, tolerance], 'flip_class': percent_class_noise}]
Note that kernel_size should be an odd number.
For example:
python noise_annotations.py /path/to/annotations --method_name my_noise_method --corruption_values [{'rand': [0.2, 3], 'localization': [0.2, 2], 'approximation': [0.2, 5], 'flip_class': 0.2}]
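The exact semantics of each parameter are defined in noise_annotations.py; judging from the names, std_dev controls the spatial jitter used by 'localization', tolerance the boundary simplification used by 'approximation', kernel_size the 'rand' corruption, and flip_class the fraction of labels that are flipped. As a rough illustration of the 'localization' idea only (not the benchmark's own implementation), a Gaussian jitter on polygon vertices could look like the sketch below:

```python
# Illustrative sketch only: Gaussian jitter on polygon vertices, applied to a
# fraction of instances. Mirrors the role of scale_proportion and std_dev above,
# but is NOT the benchmark's actual noise code. Assumes polygon segmentations.
import random
import numpy as np

def jitter_polygons(annotations, scale_proportion=0.2, std_dev=2.0, seed=1):
    rng = np.random.default_rng(seed)
    random.seed(seed)
    k = int(scale_proportion * len(annotations))
    for i in random.sample(range(len(annotations)), k):
        noisy_segs = []
        for poly in annotations[i]["segmentation"]:  # flat [x1, y1, x2, y2, ...]
            pts = np.asarray(poly, dtype=float)
            pts += rng.normal(0.0, std_dev, size=pts.shape)  # shift every coordinate
            noisy_segs.append(pts.tolist())
        annotations[i]["segmentation"] = noisy_segs
    return annotations
```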
Citation
If you use this benchmark in your research, please cite this project.
@misc{grad2024benchmarkinglabelnoiseinstance,
title={Benchmarking Label Noise in Instance Segmentation: Spatial Noise Matters},
author={Eden Grad and Moshe Kimhi and Lion Halika and Chaim Baskin},
year={2024},
eprint={2406.10891},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2406.10891},
}
License
This project is released under the Apache 2.0 license.
Please make sure you use it with properly licensed datasets.
We use MS-COCO/LVIS and Cityscapes.