Abstract
Most current algorithm evaluation protocols use large image databases but give little consideration to the imaging characteristics used to create those data sets. This paper evaluates the effects of camera shutter speed and voltage gain under simultaneous changes in illumination and demonstrates significant differences in the sensitivities of popular vision algorithms to variable illumination, shutter speed, and gain. These results show that offline data sets used to evaluate vision algorithms typically suffer from a significant sensor-specific bias, which can prevent many common experimental methodologies from producing results that generalize to less controlled environments. We show that for typical indoor scenes the saturation levels of the individual color filters are easily reached, producing localized saturation that depends not only on the scene radiance but also on the spectral density of the individual colors present in the scene. Even under constant illumination, foreshortening effects due to surface orientation can affect feature detection and saliency. Finally, we demonstrate that active and purposive control of the shutter speed and gain leads to significantly more reliable feature detection under varying illumination and nonconstant viewpoints.
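To make the closing claim concrete, the sketch below illustrates the general idea of actively controlling shutter speed and gain from image feedback: it monitors per-channel saturation and mean brightness, shortens the shutter (then lowers gain) when any color filter clips, and lengthens the shutter (then raises gain) when the image is dark. This is a minimal illustrative sketch, not the authors' method; the `Camera` class, its `grab()` behavior, and all thresholds and limits are hypothetical placeholders for a real camera driver and tuning.

```python
import numpy as np

# Hypothetical camera interface: any driver that exposes shutter and gain
# setters plus a frame grabber could be substituted here.
class Camera:
    def __init__(self):
        self.shutter_ms = 10.0
        self.gain_db = 0.0

    def grab(self):
        # Placeholder: synthetic 8-bit RGB frame whose brightness scales
        # with the current shutter/gain settings (stands in for real capture).
        scale = (self.shutter_ms / 10.0) * (10 ** (self.gain_db / 20.0))
        base = np.random.randint(40, 140, (480, 640, 3), dtype=np.uint8)
        return np.clip(base.astype(np.float32) * scale, 0, 255).astype(np.uint8)


def exposure_step(cam, target_mean=110.0, max_sat_frac=0.01):
    """One iteration of a simple exposure/gain feedback loop.

    Checks per-channel saturation (any color filter at 255) and global mean
    intensity, then adjusts shutter speed first and gain only when the
    shutter limit is reached. Thresholds are illustrative assumptions.
    """
    frame = cam.grab()
    # Localized, per-channel saturation: any single color filter clipping counts.
    sat_frac = np.mean(np.any(frame >= 255, axis=2))
    mean_val = frame.mean()

    if sat_frac > max_sat_frac or mean_val > target_mean * 1.1:
        # Too bright: shorten exposure, then drop gain.
        if cam.shutter_ms > 1.0:
            cam.shutter_ms *= 0.8
        else:
            cam.gain_db = max(0.0, cam.gain_db - 2.0)
    elif mean_val < target_mean * 0.9:
        # Too dark: lengthen exposure up to a motion-blur limit, then raise gain.
        if cam.shutter_ms < 30.0:
            cam.shutter_ms *= 1.25
        else:
            cam.gain_db = min(24.0, cam.gain_db + 2.0)
    return frame, sat_frac, mean_val


if __name__ == "__main__":
    cam = Camera()
    for i in range(10):
        _, sat, mean = exposure_step(cam)
        print(f"iter {i}: shutter={cam.shutter_ms:.1f} ms "
              f"gain={cam.gain_db:.1f} dB sat={sat:.3f} mean={mean:.1f}")
```

In practice such a loop would run per frame on the live camera stream, with the saturation statistic computed over the regions where features are being detected rather than over the whole image.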