Abstract
In this paper, a regularized stereo matching scheme using adaptive disparity estimation is proposed. By adaptively predicting the mutual correlation between the images of a stereo pair, the proposed algorithm can compress the bandwidth of an input stereo pair to the level of a conventional two-dimensional (2D) image, and the predicted image can be effectively reconstructed from a reference image and the disparity vectors. In the adaptive disparity estimation method, feature values are extracted from the input stereo pair, and the matching window used for stereo matching is adaptively selected according to the magnitude of those feature values. This adaptive window matching algorithm (AMA) improves the overall performance of stereo matching by reducing both the disparity-vector mismatches that occur in conventional dense disparity estimation with a small matching window and the blocking effect that occurs in coarse disparity estimation with a large matching window. However, it still suffers from overlap and disallocation of matching windows. To alleviate these problems, a new regularized adaptive disparity estimation method is proposed in this paper: each estimated disparity vector is regularized with its neighboring disparity vectors, and as a result the predicted stereo image is reconstructed more effectively. Experiments with the stereo sequences “Man”, “Hoon”, “Yong”, and “Car” show that the proposed algorithm improves the peak signal-to-noise ratios (PSNRs) of the reconstructed images by 8.47 and 1.53 dB, on average, over the feature and pixel-based window matching algorithm (FPMA) and AMA, respectively.
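The two ideas summarized above — selecting the matching-window size from a local feature value, then regularizing each disparity with its neighbors — can be illustrated with a minimal block-matching sketch. This is not the paper's implementation: the feature value (horizontal gradient magnitude), the window sizes, the threshold, and the use of a 3×3 median as the regularizer are all illustrative assumptions.

```python
import numpy as np

def estimate_disparity(left, right, max_disp=16, small=4, large=16, edge_thresh=20.0):
    """Adaptive-window block matching (sketch): use a small window in
    feature-rich (high-gradient) regions and a large window elsewhere.
    Feature value and thresholds are illustrative, not from the paper."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    # Illustrative feature value: horizontal gradient magnitude.
    grad = np.abs(np.gradient(left.astype(float), axis=1))
    for y in range(h):
        for x in range(w):
            half = small // 2 if grad[y, x] > edge_thresh else large // 2
            y0, y1 = max(0, y - half), min(h, y + half + 1)
            x0, x1 = max(0, x - half), min(w, x + half + 1)
            ref = left[y0:y1, x0:x1].astype(float)
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x0) + 1):
                cand = right[y0:y1, x0 - d:x1 - d].astype(float)
                cost = np.mean(np.abs(ref - cand))  # mean-absolute-difference cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def regularize(disp):
    """Regularize each disparity with its 3x3 neighborhood; a median
    filter stands in for the paper's regularization scheme."""
    h, w = disp.shape
    out = disp.copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(disp[y - 1:y + 2, x - 1:x + 2])
    return out
```

On a synthetic pair where the right image is the left shifted by a constant amount, the estimator recovers that shift in the interior, and the regularizer suppresses isolated outlier vectors without disturbing consistent neighborhoods.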