Abstract

Illumination estimation is the key routine in a camera's onboard auto-white-balance (AWB) function. Illumination estimation algorithms estimate the color of the scene's illumination from an image in the form of an R, G, B vector in the sensor's raw-RGB color space. While learning-based methods have demonstrated impressive performance for illumination estimation, cameras still rely on simple statistical-based algorithms that are less accurate but capable of executing quickly on the camera's hardware. An effective strategy to improve the accuracy of these fast statistical-based algorithms is to apply a post-estimate bias-correction function to transform the estimated R, G, B vector such that it lies closer to the correct solution. Recent work by Finlayson [Interface Focus 8, 20180008 (2018), doi:10.1098/rsfs.2018.0008] showed that a bias-correction function can be formulated as a projective transform because the magnitude of the R, G, B illumination vector does not matter to the AWB procedure. This paper builds on this finding and shows that further improvements can be obtained by using an as-projective-as-possible (APAP) projective transform that locally adapts the projective transform to the input R, G, B vector. We demonstrate the effectiveness of the proposed APAP bias correction on several well-known statistical illumination estimation methods. We also describe a fast lookup method that allows the APAP transform to be performed with only a few lookup operations.
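
To make the idea concrete, the sketch below (ours, not the authors' released code) illustrates a locally adapted projective bias correction in Python with NumPy. The function names, the Gaussian weighting with bandwidth sigma, the weight floor gamma, and the training arrays train_est / train_gt (statistical illuminant estimates and their ground-truth illuminants, both unit-normalized) are illustrative assumptions; the paper's actual fitting and lookup details may differ. Setting all weights to 1 reduces the procedure to a single global projective correction in the spirit of Finlayson's bias correction.

import numpy as np

def weighted_dlt(src, dst, w):
    # Fit a 3x3 matrix M (up to scale) such that dst[i] ~ M @ src[i] in the
    # projective sense, using a weighted direct linear transform (DLT).
    rows = []
    for x, xp, wi in zip(src, dst, w):
        # The constraint xp x (M x) = 0 yields two independent linear
        # equations in the nine entries of M.
        rows.append(wi * np.concatenate([np.zeros(3), -xp[2] * x, xp[1] * x]))
        rows.append(wi * np.concatenate([xp[2] * x, np.zeros(3), -xp[0] * x]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)  # singular vector of the smallest singular value

def apap_correct(rho, train_est, train_gt, sigma=0.1, gamma=0.05):
    # APAP-style correction: re-solve a distance-weighted DLT for each query,
    # so the projective bias correction adapts locally to the estimate rho.
    r = rho / np.linalg.norm(rho)             # magnitude is irrelevant to AWB
    d2 = np.sum((train_est - r) ** 2, axis=1)
    w = np.maximum(np.exp(-d2 / (2.0 * sigma ** 2)), gamma)  # floor avoids degeneracy
    M = weighted_dlt(train_est, train_gt, w)
    v = M @ r
    return v / np.linalg.norm(v)

Recomputing the weighted DLT for every query would be slow on camera hardware; the lookup method mentioned in the abstract instead allows the locally adapted transform to be obtained with only a few lookup operations, whereas the sketch recomputes it directly for clarity.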
