Abstract

Simultaneous localization and mapping (SLAM) systems play an important role in automated robotics and artificial intelligence. Feature detection and matching are crucial to the overall accuracy of a SLAM system. However, the accuracy of keypoint localization and matching cannot be guaranteed under changes in viewpoint, illumination, texture, and similar conditions. Moreover, many deep learning methods are sensitive to perspective change and lack invariance to geometric transformations. Therefore, this paper proposes a novel pseudo-Siamese convolutional network for transformation-invariant feature detection and description in SLAM systems. By learning transformation-invariant features and descriptors, the proposed method improves the front-end landmark detection and tracking module of the SLAM system. The input image is first mapped into a transform field, and a backbone network is designed to extract feature maps. The feature detection subnetwork and feature description subnetwork are then designed as decoupled branches; together, they form a convolutional network for transformation-invariant feature detection and description in the visual SLAM system. We conducted extensive experiments on multiple datasets, and the results demonstrate that our method achieves state-of-the-art global tracking performance compared with traditional visual SLAM systems.
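To make the described pipeline concrete, the sketch below shows one plausible way to structure such a pseudo-Siamese detector/descriptor in PyTorch: two convolutional branches with identical architecture but independent weights process an image and a geometrically transformed copy, and each branch feeds a decoupled detection head (keypoint score map) and description head (dense descriptors). This is a minimal illustration under our own assumptions; the class names, layer sizes, and the warping step are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of a pseudo-Siamese feature detection/description net.
# All names and dimensions are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Backbone(nn.Module):
    # Convolutional encoder producing a dense feature map.
    def __init__(self, in_ch=1, feat_ch=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)

class PseudoSiameseFeatureNet(nn.Module):
    # Two branches with identical architecture but unshared weights (the
    # "pseudo-Siamese" part): one sees the original image, the other a
    # geometrically transformed (e.g., warped) copy.
    def __init__(self, feat_ch=128, desc_dim=256):
        super().__init__()
        self.branch_a = Backbone(feat_ch=feat_ch)
        self.branch_b = Backbone(feat_ch=feat_ch)
        # Decoupled heads: per-pixel keypoint scores and dense descriptors.
        self.detect = nn.Conv2d(feat_ch, 1, 1)
        self.describe = nn.Conv2d(feat_ch, desc_dim, 1)

    def heads(self, feats):
        scores = torch.sigmoid(self.detect(feats))       # (B, 1, H, W)
        desc = F.normalize(self.describe(feats), dim=1)  # (B, D, H, W), unit norm
        return scores, desc

    def forward(self, img, img_warped):
        out_a = self.heads(self.branch_a(img))
        out_b = self.heads(self.branch_b(img_warped))
        return out_a, out_b

# Usage: descriptors at corresponding (warped) locations should agree, which
# is what a transformation-invariance training loss would enforce.
net = PseudoSiameseFeatureNet()
img = torch.randn(1, 1, 240, 320)
img_warped = torch.randn(1, 1, 240, 320)  # stand-in for a warped view
(scores_a, desc_a), (scores_b, desc_b) = net(img, img_warped)
```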
