Abstract

Stereo correspondence is a common tool in computer and robot vision, with numerous applications such as recovering three-dimensional depth information of objects for virtual reality, autonomous vehicles, and robot navigation from a pair of left and right images captured by a stereo camera system. Computation time is an important factor when estimating dense disparity for these applications. For a pixel in the left image, its correspondence must be searched for in the right image along the epipolar line within a maximum disparity search range. The intensity of a single pixel in the left image does not have sufficient discriminatory power to determine its correspondence uniquely in the right image, so the neighboring pixels that form a window around it are used for accurate estimation. In window-based approaches, this correspondence, or disparity, is conventionally determined by matching windows of pixels using the sum of squared differences (SSD), the sum of absolute differences (SAD), or normalized correlation. To reduce the computation time, we propose a fast algorithm in which it is not necessary to compute the window costs for all candidate pixels in the right image within the search range. To determine the correspondence of a pixel in the left image, we compute the window costs only for those candidate pixels in the right image whose intensities differ from the intensity of the left-image pixel by no more than a certain threshold. We applied the proposed method to standard stereo images and found that it reduces the computation time by about 30% with almost no degradation in accuracy.
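The intensity-gating idea can be illustrated with a short sketch. The Python code below is a minimal, illustrative implementation for a rectified grayscale stereo pair, not the authors' implementation; the parameter names (max_disparity, window_radius, intensity_threshold) and the choice of SAD as the window cost are assumptions made for the example. It evaluates the window cost only for candidates on the same scanline whose centre intensity lies within the threshold of the reference pixel's intensity, skipping all other candidates.

    import numpy as np

    def disparity_map_intensity_gated(left, right, max_disparity=64,
                                      window_radius=3, intensity_threshold=10):
        """Estimate a dense disparity map for the left image of a rectified pair.

        For each left-image pixel, the SAD window cost is computed only for
        candidate right-image pixels on the same scanline whose centre
        intensity differs from the left pixel's intensity by at most
        intensity_threshold; other candidates are skipped.
        """
        left = left.astype(np.int32)
        right = right.astype(np.int32)
        h, w = left.shape
        r = window_radius
        disparity = np.zeros((h, w), dtype=np.int32)

        for y in range(r, h - r):
            for x in range(r, w - r):
                ref_window = left[y - r:y + r + 1, x - r:x + r + 1]
                ref_intensity = left[y, x]
                best_cost, best_d = None, 0
                for d in range(0, min(max_disparity, x - r) + 1):
                    # Gate: skip candidates whose centre intensity differs too much.
                    if abs(int(right[y, x - d]) - int(ref_intensity)) > intensity_threshold:
                        continue
                    cand_window = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                    cost = np.abs(ref_window - cand_window).sum()
                    if best_cost is None or cost < best_cost:
                        best_cost, best_d = cost, d
                disparity[y, x] = best_d
        return disparity

In this sketch, pixels for which every candidate is rejected by the gate keep a disparity of zero; a practical implementation would fall back to an ungated search or interpolate from neighbors in that case.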
