Abstract

Depth information can be obtained with stereo matching algorithms, which compute the horizontal displacement (disparity) of corresponding points and convert it to depth using the triangulation relation. However, the matching process is challenging in the presence of textureless regions. This paper proposes a novel disparity refinement method for stereo matching based on the Semi-Global Matching (SGM) algorithm for textureless images. In brief, SGM is a high-performance matching algorithm for a stereo image pair that reaches a trade-off between matching quality and computational complexity. Our main contribution is an improvement of matching quality in textureless regions. At the end of SGM, a right-to-left consistency check is performed to remove invalid pixels from the disparity map; the proposed refinement is applied after this consistency check. We assume that the textureless regions in the original stereo pair are planar. We employ edge detection to extract the textureless regions, and the fitting method is applied in the horizontal and vertical directions respectively. If the distance between two adjacent edge pixels is sufficiently large, the corresponding line segment is considered textureless. For every textureless line segment, the RANSAC algorithm is run on the corresponding segment of the disparity map. We use the well-known Middlebury dataset to compare our method with standard SGM and other matching algorithms; the results show that our method performs well on most textureless stereo pairs.
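The following is a minimal sketch of the refinement step described above, for one horizontal scanline only. It assumes a precomputed SGM disparity map and a binary edge map; the function names, the gap threshold `min_gap`, and the RANSAC parameters are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch: refine disparities on textureless segments of one image row.
# A segment between two edge pixels that is longer than `min_gap` is treated
# as textureless (planar), and its disparities are replaced by a RANSAC line fit.
import numpy as np

def ransac_line_fit(xs, ds, iters=100, inlier_tol=1.0, rng=None):
    """Fit d = a*x + b to (xs, ds) with RANSAC; return the fitted disparities."""
    rng = rng or np.random.default_rng(0)
    best_inliers = None
    for _ in range(iters):
        i, j = rng.choice(len(xs), size=2, replace=False)
        a = (ds[j] - ds[i]) / (xs[j] - xs[i])
        b = ds[i] - a * xs[i]
        inliers = np.abs(a * xs + b - ds) < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers is None or best_inliers.sum() < 2:
        return ds  # not enough support; leave the segment unchanged
    # Refit a line to all inliers (least squares) and evaluate it on the segment.
    a, b = np.polyfit(xs[best_inliers], ds[best_inliers], 1)
    return a * xs + b

def refine_row(disp_row, edge_row, min_gap=20):
    """Refine one scanline: fit lines to disparities between distant edge pixels."""
    edge_idx = np.flatnonzero(edge_row)
    # Include the image borders so border-adjacent segments are covered too.
    bounds = np.concatenate(([0], edge_idx, [len(disp_row) - 1]))
    out = disp_row.astype(float)
    for left, right in zip(bounds[:-1], bounds[1:]):
        if right - left >= min_gap:  # long gap between edges -> textureless
            xs = np.arange(left, right + 1, dtype=float)
            out[left:right + 1] = ransac_line_fit(xs, out[left:right + 1])
    return out
```

The same routine would be run column-wise for the vertical direction; since the textureless regions are assumed planar, disparity along any 1-D segment inside them is linear, which is why a line fit per segment suffices.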
