Abstract

Stereo matching algorithms have been studied for many years, but most work focuses on benchmark datasets and is rarely applied to real scenarios such as industrial robot workcells. Traditional stereo matching algorithms suffer from high error rates, while deep learning methods struggle in real scenarios because of weak generalisation and the difficulty of obtaining training data. To use stereo matching for industrial robot guidance, it is preferable to design a new traditional algorithm with low time complexity tailored to industrial robot scenes, which are dominated by planar facets. This paper proposes a matching method based on sub-rows of pixels, rather than individual pixels, to improve matching robustness and reduce running time. First, the pixel strings from the same row of the left and right images are divided into colour-identical or colour-gradient segments. The colour and length of corresponding left and right segments are then used as cues to establish a matching relation and its match type, and all match types are resolved under a non-crossing mapping constraint. From each match type, the spatial state of the stimulus source can be inferred, so the disparity of the pixels in the segments representing that state can be calculated. The method makes full use of the stimulus-homology and projective-geometry constraints of row-aligned images; it achieves good results in industrial robot scenarios and can be applied to industrial robot guidance.
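To make the segmentation step more concrete, the sketch below shows one possible way to split a single image row into colour-identical or colour-gradient segments. It is a minimal illustration of the idea described in the abstract, not the authors' implementation; the function name `segment_row` and the thresholds `flat_tol` and `grad_tol` are assumptions introduced here for clarity.

```python
import numpy as np

def segment_row(row, flat_tol=4, grad_tol=4):
    """Split one rectified image row into colour-identical or colour-gradient segments.

    row      : (W, 3) integer array of RGB pixels from one row.
    flat_tol : hypothetical tolerance on colour change within a colour-identical segment.
    grad_tol : hypothetical tolerance on change of the colour gradient within a
               colour-gradient segment.
    Returns a list of (start, end, mean_colour) tuples; end is exclusive.
    """
    row = row.astype(np.int32)
    segments = []
    start = 0
    prev_diff = np.zeros(3, dtype=np.int32)
    for x in range(1, row.shape[0]):
        diff = row[x] - row[x - 1]
        # The current segment continues while the colour is nearly constant
        # (colour-identical) or changes at a nearly constant rate (colour-gradient);
        # otherwise a new segment starts at x.
        is_flat = np.abs(diff).max() <= flat_tol
        is_gradient = np.abs(diff - prev_diff).max() <= grad_tol
        if not (is_flat or is_gradient):
            segments.append((start, x, row[start:x].mean(axis=0)))
            start = x
            prev_diff = np.zeros(3, dtype=np.int32)
        else:
            prev_diff = diff
    segments.append((start, row.shape[0], row[start:].mean(axis=0)))
    return segments
```

In a full pipeline along the lines the abstract describes, the segments extracted from the same row of the left and right images would then be compared by colour and length, with candidate matches restricted to a non-crossing (ordering-preserving) mapping before per-segment disparities are computed.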
