The Challenge on Object Detection and Instance Segmentation in the Dark

Evaluation Metric

In this challenge, we use Average Precision (AP) as the evaluation metric, the same metric used in the COCO competition.

Metric            Description
Average Precision (AP):
  AP              AP at IoU=.50:.05:.95 (primary challenge metric)
  AP (IoU=.50)    AP at IoU=.50 (PASCAL VOC metric)
  AP (IoU=.75)    AP at IoU=.75 (strict metric)
AP Across Scales:
  AP (small)      AP for small objects: area < 32²
  AP (medium)     AP for medium objects: 32² < area < 96²
  AP (large)      AP for large objects: area > 96²
Average Recall (AR):
  AR (max=1)      AR given 1 detection per image
  AR (max=10)     AR given 10 detections per image
  AR (max=100)    AR given 100 detections per image
AR Across Scales:
  AR (small)      AR for small objects: area < 32²
  AR (medium)     AR for medium objects: 32² < area < 96²
  AR (large)      AR for large objects: area > 96²
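
Since these are the standard COCO metrics, results can be sanity-checked locally with the pycocotools package before uploading. The snippet below is a minimal sketch, assuming a ground-truth annotation file named annotations.json and a result file named results.json; both filenames are placeholders, not files provided by the challenge.

    # Minimal local-evaluation sketch using pycocotools; "annotations.json"
    # and "results.json" are placeholder filenames.
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    coco_gt = COCO("annotations.json")          # ground-truth annotations
    coco_dt = coco_gt.loadRes("results.json")   # your COCO-format results

    # Use iouType="bbox" for detection, "segm" for instance segmentation.
    coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
    coco_eval.evaluate()
    coco_eval.accumulate()
    coco_eval.summarize()                       # prints the AP/AR metrics listed above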

Format of Submission

To submit your results, you are required to submit a .json file. We follow the COCO results format.
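
In the COCO detection results format, each entry in the JSON file is a dictionary with image_id, category_id, bbox (in [x, y, width, height] order), and score; for instance segmentation results, a segmentation field (polygon or RLE) is used instead of bbox. The sketch below writes such a file; the ids, box, score, and the output filename are illustrative placeholders (the required filenames are given in the next subsection).

    # Minimal sketch of writing a COCO-format detection results file;
    # the image_id, category_id, box, and score values are placeholders.
    import json

    results = [
        {
            "image_id": 1,                        # image id from the annotation file
            "category_id": 3,                     # predicted category id
            "bbox": [258.2, 41.3, 348.3, 243.8],  # [x, y, width, height] in pixels
            "score": 0.91,                        # detection confidence
        },
        # ... one dictionary per detection
    ]

    # "results.json" is a placeholder; use the filenames required below.
    with open("results.json", "w") as f:
        json.dump(results, f)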

Format of Filenames
After processing the input images of the validation set, you should save the results as:

lis_coco_png_raw_dark_valonly_challenge.json

After processing the input images of the testing set, you should save the results as:

lis_coco_png_raw_dark_testonly_challenge.json

Save the JSON file containing the results you want to submit directly in a folder; the folder must not contain any subfolders. Before submission, compress the folder into a zip file, for example as in the sketch below.
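
The packaging step could look like the following sketch, where the folder name "submission" is an arbitrary choice and the validation-set filename from above is assumed.

    # Minimal packaging sketch; the folder name "submission" is arbitrary.
    import shutil
    from pathlib import Path

    folder = Path("submission")
    folder.mkdir(exist_ok=True)

    # Put only the result JSON file(s) in the folder, with no subfolders.
    shutil.copy("lis_coco_png_raw_dark_valonly_challenge.json", folder)

    # Compress the folder into submission.zip for upload.
    shutil.make_archive("submission", "zip", root_dir=".", base_dir="submission")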

Submission

Please upload the zip file to the CodaLab competition page:

https://codalab.lisn.upsaclay.fr/competitions/17833