Abstract

Illumination estimation is the essential step of computational color constancy, one of the core parts of the image processing pipelines of modern digital cameras. An accurate and reliable illumination estimate is important for reducing the influence of the illumination on image colors. To motivate new ideas and the development of new algorithms in this field, two challenges on illumination estimation were conducted. The main advantage of testing a method in a challenge, rather than on a known dataset, is that the ground-truth illuminations for the challenge test images remain unknown until the results have been submitted, which prevents any potentially biased hyperparameter tuning. The first illumination estimation challenge (IEC#1) had only a single track: global illumination estimation. The second illumination estimation challenge (IEC#2) added two further tracks covering indoor and two-illuminant illumination estimation. Its other main features are a new large dataset of about 5000 images taken with the same camera sensor model, a manual markup accompanying each image, diverse content with scenes taken in numerous countries under a wide variety of illuminations extracted using the SpyderCube calibration object, and a contest-like markup for the images from the Cube++ dataset. This article describes the two past challenges, the algorithms that won each track, and the conclusions drawn from the results of the first and second challenge that can be useful for similar future developments.
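To make the global illumination estimation task concrete: given an image, the goal is to recover a single RGB vector describing the scene illuminant, which can then be divided out to correct the colors. The sketch below is not any challenge-winning method; it is the classic gray-world baseline, a minimal assumed example of what a global illuminant estimator looks like (it presumes the average scene reflectance is achromatic, so the per-channel means are proportional to the illuminant).

```python
import numpy as np

def gray_world(image: np.ndarray) -> np.ndarray:
    """Estimate a single global illuminant under the gray-world assumption:
    the mean of each color channel is proportional to the illuminant color.
    Returns a unit-length RGB vector (only chromaticity is meaningful)."""
    est = image.reshape(-1, 3).mean(axis=0)   # per-channel mean
    return est / np.linalg.norm(est)          # normalize to unit length

# Synthetic sanity check: a uniformly gray scene lit by a reddish light.
illuminant = np.array([0.8, 0.5, 0.33])
scene = np.full((4, 4, 3), 0.5) * illuminant  # gray reflectance x illuminant
estimate = gray_world(scene)                  # recovers the illuminant direction
```

On such a synthetic gray scene the estimate matches the normalized illuminant exactly; on real images the gray-world assumption is only approximate, which is precisely what learned challenge entries aim to improve on.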
