Noisy-Labels-Instance-Segmentation

This is the official repo for the paper "A Benchmark for Learning with Noisy Labels in Instance Segmentation".


ReadMe:

Important! The original annotations should be in COCO format.
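
For reference, a COCO-format instance-segmentation file is a single JSON file with "images", "annotations", and "categories" lists; each annotation carries the polygon segmentation, bounding box, and category id that the noising operates on. A minimal illustrative sketch in Python (all names and values below are placeholders, not from this repo):

import json

# Minimal COCO-style annotation file (placeholder values for illustration).
minimal_coco = {
    "images": [
        {"id": 1, "file_name": "000000000001.jpg", "height": 480, "width": 640}
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            "segmentation": [[100.0, 100.0, 200.0, 100.0, 200.0, 200.0, 100.0, 200.0]],
            "bbox": [100.0, 100.0, 100.0, 100.0],
            "area": 10000.0,
            "iscrowd": 0,
        }
    ],
    "categories": [{"id": 1, "name": "person"}],
}

with open("annotations.json", "w") as f:
    json.dump(minimal_coco, f)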

To run the benchmark, run the following:

python noise_annotations.py /path/to/annotations --benchmark {easy,medium,hard} --seed 1

where --benchmark selects the benchmark level (easy, medium, or hard).

For example:

python noise_annotations.py /path/to/annotations --benchmark easy --seed 1
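
Assuming the script writes the noised annotations to a new COCO-style JSON file (the output file name below is hypothetical) and keeps the original annotation ids, the result can be compared against the clean file with pycocotools, for example to count how many class labels were flipped:

from pycocotools.coco import COCO

# Hypothetical file names for illustration; check the script's actual output path.
clean = COCO("/path/to/annotations/instances_val.json")
noisy = COCO("/path/to/annotations/instances_val_easy_seed1.json")

# Count annotations whose category label changed after noising,
# assuming annotation ids are preserved between the two files.
flipped = sum(
    1
    for ann_id, ann in clean.anns.items()
    if ann_id in noisy.anns and noisy.anns[ann_id]["category_id"] != ann["category_id"]
)
print("flipped labels:", flipped)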

To run a custom noise method, run the following:

python noise_annotations.py /path/to/annotations --method_name method_name --corruption_values [{'rand': [scale_proportion, kernel_size], 'localization': [scale_proportion, std_dev], 'approximation': [scale_proportion, tolerance], 'flip_class': percent_class_noise}]

where kernel_size should be an odd number.

For example:

 python noise_annotations.py /path/to/annotations --method_name my_noise_method --corruption_values [{'rand': [0.2, 3], 'localization': [0.2, 2], 'approximation': [0.2, 5], 'flip_class': 0.2}]
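
To make the corruption_values entries concrete, the sketch below shows one way class-label noise in the spirit of 'flip_class' could be applied: re-assign a given fraction of category ids uniformly at random. This is only an illustration of what the parameter controls, not the benchmark's actual implementation.

import json
import random

def flip_class_labels(annotations_path, percent_class_noise, seed=1):
    # Illustrative 'flip_class'-style noise: randomly change the category id
    # of roughly percent_class_noise of the annotations.
    random.seed(seed)
    with open(annotations_path) as f:
        data = json.load(f)
    cat_ids = [c["id"] for c in data["categories"]]
    for ann in data["annotations"]:
        if random.random() < percent_class_noise:
            choices = [c for c in cat_ids if c != ann["category_id"]]
            if choices:
                ann["category_id"] = random.choice(choices)
    return data

# Example matching 'flip_class': 0.2 above: flip roughly 20% of the labels.
noisy_data = flip_class_labels("/path/to/annotations/instances_train.json", 0.2)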

Citation

If you use this benchmark in your research, please cite this project.

A BibTeX entry will be available shortly.

License

This project is released under the Apache 2.0 license.

Please make sure you use it with properly licensed datasets.

We use MS-COCO, LVIS, and Cityscapes.
