Abstract

Establishing robust and effective data association has long been a core problem in vision-based SLAM (Simultaneous Localization and Mapping). In this paper, we propose a geometric correspondence estimation network, GCENet, tailored for visual tracking and loop detection in visual-inertial SLAM. GCENet considers both local and global correlation between frames, enabling deep feature matching in scenarios with large displacements. Building on this, we introduce a tightly coupled visual-inertial state estimation system. To address challenges in extreme environments, such as strong illumination and weak texture, where handcrafted feature matching tends to fail, a compensatory deep optical flow tracker is incorporated into our system. In such cases, our approach uses GCENet for dense optical flow tracking, replacing the handcrafted pipeline for visual tracking. Furthermore, a deep loop detector based on GCENet is constructed, which uses the estimated flow to represent scene similarity. Spatial consistency checks on candidate loops are performed with GCENet to establish long-term data association, effectively suppressing both false negatives and false positives in loop closure. Dedicated experiments are conducted on the EuRoC drone dataset, the TUM 4Seasons dataset, and a private robot dataset to evaluate the proposed method. The results demonstrate that our system achieves superior robustness and accuracy in extreme environments compared with state-of-the-art methods.
