Abstract

Lifelong autonomous operation has gained much attention in the field of mobile robotics in recent years. In the context of vision-based robot navigation, lifelong applications include scenarios with substantial perceptual changes due to changes in season, illumination, and weather. In this paper, we present an approach to localize a mobile robot, equipped with a low-frequency camera, with respect to an image sequence recorded in a different season. Our approach employs a discrete Bayes filter with a sensor model based on whole-image descriptors. We compute a similarity matrix over all image descriptors and leverage the sequential nature of typical image streams with a flexible transition scheme in the Bayes filter framework. Since we compute a probability distribution over the entire state space, our approach can handle complex trajectories that may include same-season loop closures as well as fragmented sub-sequences. Furthermore, we show that decorrelating the similarity matrix results in improved localization performance. Through an extensive experimental evaluation on challenging datasets, we demonstrate that our approach outperforms state-of-the-art techniques.
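To make the filtering idea concrete, the sketch below shows a minimal discrete Bayes filter over reference-image indices driven by a precomputed image similarity matrix. This is an illustrative assumption-laden sketch, not the paper's implementation: the function name `localize`, the Gaussian transition kernel, and the parameters `transition_window` and `motion_sigma` are hypothetical choices standing in for the paper's flexible transition scheme and sensor model.

```python
import numpy as np

def localize(similarity, transition_window=2, motion_sigma=1.0):
    """Discrete Bayes filter over reference-image indices (illustrative sketch).

    similarity: (T, N) matrix where similarity[t, i] compares query image t
    against reference image i (e.g. computed from whole-image descriptors).
    Returns the maximum a posteriori reference index for every query image.
    """
    T, N = similarity.shape
    belief = np.full(N, 1.0 / N)  # uniform prior over reference positions

    # Hypothetical transition kernel: the robot most likely advances by about
    # one reference frame per query frame, but small jumps, stops, and
    # loop-closure-like deviations are allowed within the window.
    offsets = np.arange(-transition_window, transition_window + 1)
    kernel = np.exp(-0.5 * ((offsets - 1) / motion_sigma) ** 2)
    kernel /= kernel.sum()

    estimates = []
    for t in range(T):
        # Prediction step: spread the belief according to the transition kernel.
        belief = np.convolve(belief, kernel, mode="same")
        # Correction step: reweight by the (non-negative) image similarity,
        # used here as a stand-in for the sensor likelihood.
        likelihood = np.clip(similarity[t], 1e-9, None)
        belief *= likelihood
        belief /= belief.sum()
        estimates.append(int(np.argmax(belief)))
    return estimates
```

Because the belief is maintained over the entire reference sequence rather than a single hypothesis, the filter can recover after fragmented sub-sequences and can re-converge at loop closures, which is the behavior the abstract attributes to the full-distribution formulation.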
