Abstract

The navigation of autonomous mobile robots remains challenging when exteroceptive sensors such as cameras, LiDARs, and radars operate in textureless and structureless environments. This paper presents a robust state estimation system for holonomic mobile robots that relies on intrinsic sensors and adaptive factor graph optimization in such degraded scenarios. In particular, neural networks are employed to learn the observation and noise models using only IMU and wheel-encoder data. We investigate the learned model for the holonomic mobile robot across various neural network architectures, and show that lightweight networks can be more accurate while demanding less computing power when fed inertial and wheel-encoder measurements. Furthermore, we use an industrial holonomic robot platform equipped with multiple LiDARs, cameras, an IMU, and wheel encoders to conduct the experiments and to create ground truth without a bulky motion-capture system. The collected datasets are then used to train the neural networks. Finally, the experimental evaluation shows that our solution provides better accuracy and real-time performance than competing solutions.

<i>Note to Practitioners</i>&#x2014;Autonomous mobile robots need to operate robustly in challenging environments that deny exteroceptive sensors such as cameras, LiDARs, and radars. In this situation, the navigation system must rely on intrinsic sensors, namely the inertial sensor and the wheel encoders. Existing conventional methods combine these sensors through recursive Bayesian filtering without adapting the sensor models. Besides, deep learning-based solutions adopt large networks such as LSTMs or CNNs to handle the estimation problem. This work develops the state estimation subsystem of a navigation system for holonomic mobile robots that uses the intrinsic sensors within adaptive factor graph optimization.
In particular, we present how to efficiently combine the factor graph with learned observation models for the IMU and wheel-encoder factors. Neural networks are introduced to learn the observation model from IMU and wheel-encoder inputs, and we find that lightweight networks can outperform heavier deep learning architectures on this sensor combination. Finally, the neural networks are embedded in a factor graph to perform smoothing-based state estimation. The proposed system operates with high accuracy in real time.
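To make the idea concrete, the following is a minimal NumPy sketch of the pattern the abstract describes: a lightweight network maps a short window of IMU and wheel-encoder readings to a velocity observation plus a learned (adaptive) noise model, and that covariance then whitens the residual of the corresponding factor in the graph. The window size, layer widths, and random weights are illustrative placeholders, not the paper's actual architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lightweight observation model: a one-hidden-layer MLP mapping a
# short window of IMU + wheel-encoder readings to a body velocity and a
# per-axis log-variance (the adaptive noise model). The weights below are
# random placeholders standing in for trained parameters.
WINDOW, IN_DIM, HIDDEN = 10, 9, 32   # 9 = 3 gyro + 3 accel + 3 wheel speeds
W1 = rng.normal(0.0, 0.1, (HIDDEN, WINDOW * IN_DIM))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (6, HIDDEN))   # 3 velocity + 3 log-variance outputs
b2 = np.zeros(6)

def observation_model(window):
    """Predict body velocity and a diagonal covariance from a sensor window."""
    h = np.maximum(0.0, W1 @ window.ravel() + b1)   # ReLU hidden layer
    out = W2 @ h + b2
    vel, log_var = out[:3], out[3:]
    return vel, np.diag(np.exp(log_var))            # exp keeps covariance PSD

def factor_error(vel_pred, cov, vel_state):
    """Whitened residual of a learned velocity factor in the graph."""
    info_sqrt = np.linalg.cholesky(np.linalg.inv(cov))
    return info_sqrt @ (vel_state - vel_pred)

window = rng.normal(size=(WINDOW, IN_DIM))
vel, cov = observation_model(window)
e = factor_error(vel, cov, vel_state=np.zeros(3))
print(vel.shape, cov.shape, e.shape)
```

In a full smoothing pipeline, an optimizer would minimize the sum of squared whitened residuals over all such factors; because the network also predicts the variance, degraded wheel contact or aggressive motion can be down-weighted automatically rather than with a fixed hand-tuned noise model.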

