Abstract

With advances in artificial intelligence and computer hardware, stereo matching algorithms have been widely researched and applied in image processing. In scenarios such as robot navigation and autonomous driving, stereo matching provides depth information about the surrounding environment, improving a robot's capacity for autonomous navigation. In this paper, we address the low matching accuracy of stereo matching algorithms in specular image regions and propose a multi-attention-based stereo matching algorithm, MANet. The proposed algorithm embeds a multi-spectral attention module into the residual feature-extraction network of PSMNet: different 2D discrete cosine transforms extract frequency-specific feature information, supplying rich and effective features for matching-cost computation. The pyramid pooling module incorporates a coordinate attention mechanism, which maintains long-range dependencies with directional awareness while capturing more positional information during pooling, thereby enhancing the network's representational capacity. MANet was evaluated on three major benchmark datasets (SceneFlow, KITTI 2015, and KITTI 2012) and compared with related algorithms. Experimental results show that MANet predicts disparities more accurately and is more robust to specular reflections, enabling more accurate disparity prediction in specular regions.
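The multi-spectral attention idea above (pooling feature maps against several 2D discrete cosine transform bases rather than only the DC/average component) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, channel grouping, and squeeze-and-excitation gating are illustrative assumptions.

```python
import numpy as np

def dct2_basis(u, v, h, w):
    # 2D DCT basis B_{u,v}(i, j) = cos(pi*u*(i+0.5)/h) * cos(pi*v*(j+0.5)/w)
    i = np.arange(h)[:, None]
    j = np.arange(w)[None, :]
    return np.cos(np.pi * u * (i + 0.5) / h) * np.cos(np.pi * v * (j + 0.5) / w)

def multi_spectral_pool(feat, freqs):
    # feat: (C, H, W); freqs: list of (u, v) frequency pairs, one per channel group.
    # Each channel group is pooled against a different DCT basis, so the channel
    # descriptor retains frequency components beyond the DC (global-average) term.
    c, h, w = feat.shape
    groups = np.array_split(np.arange(c), len(freqs))
    desc = np.empty(c)
    for idx, (u, v) in zip(groups, freqs):
        basis = dct2_basis(u, v, h, w)
        desc[idx] = (feat[idx] * basis).sum(axis=(1, 2))
    return desc

def multi_spectral_attention(feat, freqs, w1, w2):
    # Squeeze-and-excitation style gating driven by the multi-spectral descriptor:
    # two fully connected layers (w1, w2 are hypothetical learned weights),
    # a ReLU, and a sigmoid producing per-channel attention weights.
    desc = multi_spectral_pool(feat, freqs)
    hidden = np.maximum(w1 @ desc, 0.0)             # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))     # sigmoid, one weight per channel
    return feat * gate[:, None, None]
```

Note that with the single frequency pair (0, 0) the DCT basis is all ones, so the descriptor reduces to plain global-sum pooling; the extra frequency pairs are what give the module its richer, frequency-specific channel statistics.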
