Abstract
Inertial Navigation Systems (INS) play an increasingly important role in automotive electronics and aerospace applications, particularly in autonomous vehicles, owing to their low computational load, swift response, and high autonomy. However, substantial error accumulation poses a significant challenge for an INS, especially when low-cost Inertial Measurement Units (IMUs) are employed. This study proposes R-AFNIO, a convolutional and attention-based deep learning network developed to mitigate error accumulation and to fuse IMU array data. First, we introduce a self-supervised learning model that learns prior knowledge from IMU observations by masking redundant IMU data and reconstructing it, thereby reducing IMU measurement noise. Second, we present an intelligent fusion framework that employs an attention-based soft-weighting algorithm to mine the latent information within redundant IMUs, which enhances fusion precision and strengthens robustness against erroneous observations. Notably, this is the first approach that applies deep learning to the information fusion problem of redundant IMUs (IMU arrays). Lastly, we propose a state-augmented tight integration algorithm to improve the local accuracy and robustness of the navigation system. We comprehensively validate R-AFNIO on both a publicly available dataset and a dataset collected by our team. Experimental results demonstrate that R-AFNIO achieves accurate and robust performance on most indicators: compared with several recent studies, the absolute trajectory error is reduced by 20.2% to 97.7%, and the relative trajectory error by 18.5% to 98.7%. Ablation experiments further highlight the effectiveness of R-AFNIO's self-supervised and redundant-weighting modules.
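To illustrate the attention-based soft-weighting idea described in the abstract, the following is a minimal PyTorch-style sketch that learns an attention score for each IMU in an array and fuses their 6-axis measurements by a weighted sum. The module name `SoftWeightedIMUFusion`, the scoring network, and all layer sizes are assumptions introduced here for illustration only; they are not the paper's actual architecture.

```python
import torch
import torch.nn as nn


class SoftWeightedIMUFusion(nn.Module):
    """Illustrative attention-based soft-weighting over a redundant IMU array.

    Each of the N redundant IMUs contributes a 6-axis sample
    (3-axis gyroscope + 3-axis accelerometer). A small scoring network
    (an assumption, not the paper's design) produces one attention score
    per IMU, and the scores are normalized with a softmax to form soft
    weights used for a weighted combination of the measurements.
    """

    def __init__(self, feat_dim: int = 6, hidden: int = 32):
        super().__init__()
        self.score_net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, imu_array: torch.Tensor) -> torch.Tensor:
        # imu_array: (batch, num_imus, 6)
        scores = self.score_net(imu_array)        # (batch, num_imus, 1)
        weights = torch.softmax(scores, dim=1)    # soft weights sum to 1 over IMUs
        fused = (weights * imu_array).sum(dim=1)  # (batch, 6) fused measurement
        return fused


# Example: fuse a batch of 4 samples from a hypothetical array of 8 IMUs
fusion = SoftWeightedIMUFusion()
fused = fusion(torch.randn(4, 8, 6))
print(fused.shape)  # torch.Size([4, 6])
```

Because the weights are produced by a softmax rather than a hard selection, an IMU with erroneous observations can be smoothly down-weighted instead of discarded, which is the robustness property the abstract attributes to the soft-weighting fusion.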