Abstract

To avoid false colors, classical color sensors cut the infrared wavelengths to which silicon is sensitive, using an infrared cutoff filter (IR-cut). However, in low-light situations, noise can degrade images. To increase the number of photons received by the sensor, in other words its sensitivity, it has been proposed to remove the IR-cut for low-light applications. In this paper, we analyze whether this methodology is beneficial from a signal-to-noise ratio point of view when the desired result is a color image. To this end, we recall the formalism behind physical raw image acquisition and color reconstruction. A comparative study is carried out between a classical color sensor and a specific color sensor designed for low-light conditions. Simulated results, computed for both sensors under the same exposure settings, show that the raw signal-to-noise ratio is better for the low-light sensor. However, its reconstructed color image appears noisier. Our formalism geometrically illustrates the reasons for this degradation in the case of the low-light sensor: on the one hand, the higher correlation between spectral channels, and on the other hand, the near-infrared part of the raw signal, which is not intrinsically useful for color.
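
The noise-amplification mechanism the abstract describes can be sketched numerically. The following Python snippet is an illustrative toy model, not the paper's sensors or formalism: the spectral mixing matrices are invented, and the "+0.8" shared near-infrared term is a deliberately simple stand-in for removing the IR-cut. It only shows the geometric mechanism the abstract argues: when raw channels become strongly correlated, the color reconstruction matrix needs larger coefficients, so the same raw noise is amplified more in the reconstructed color image.

```python
import numpy as np

def noise_amplification(S):
    """Per-channel noise amplification of color reconstruction.

    S maps scene colors to raw channels (raw = S @ color). Rows are
    normalized first so both sensors produce the same raw signal level,
    isolating the effect of inter-channel correlation. For independent
    unit-variance raw noise, the standard deviation of reconstructed
    channel i is the L2 norm of row i of the reconstruction matrix.
    """
    S = S / np.linalg.norm(S, axis=1, keepdims=True)
    M = np.linalg.inv(S)  # color reconstruction matrix (raw -> color)
    return np.sqrt((M ** 2).sum(axis=1))

# Hypothetical classical sensor (with IR-cut): channels well separated.
S_classic = np.array([[0.9, 0.2, 0.1],
                      [0.2, 0.9, 0.2],
                      [0.1, 0.2, 0.9]])

# Hypothetical low-light sensor (no IR-cut): every channel also collects
# a large shared near-infrared component, correlating the rows.
S_lowlight = S_classic + 0.8

print("classical :", noise_amplification(S_classic).round(2))
print("low-light :", noise_amplification(S_lowlight).round(2))
```

With these made-up numbers, the low-light sensor's amplification factors come out roughly twice those of the classical sensor: its mixing matrix is closer to singular, so reconstruction must subtract the shared near-infrared component with large coefficients, degrading the color-image signal-to-noise ratio even when the raw signal-to-noise ratio is better.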
