Abstract

To address the problem that a pedestrian navigation system cannot position and navigate effectively when the foot-mounted IMU exceeds its measurement range during vigorous activities or collisions with obstacles, a novel pedestrian navigation method is proposed based on the construction of an adapted virtual inertial measurement unit (VIMU) assisted by gait-type classification. The method uses an attention-based convolutional neural network (CNN) to classify the gait type of the pedestrian wearing the navigation system. Then, with inertial data collected synchronously from the pedestrian's thigh and foot via physical IMUs as training and testing samples, corresponding ResNet-Gated Recurrent Unit (ResNet-GRU) deep hybrid neural network models are built for the different gait types. Through these models, a virtual foot-mounted IMU is constructed and used for positioning whenever the physical foot-mounted IMU goes over range. Experimental results show that the proposed VIMU construction method effectively improves the performance and reliability of a zero-velocity-update-based pedestrian navigation system when the pedestrian moves vigorously or strikes obstacles with the foot, thereby enhancing the system's adaptability in complex and unknown terrain. The positioning error across the comprehensive set of gait types is about 1.45% of the total walking distance, which meets the accuracy requirements of military and civilian applications.
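The core fallback behavior described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the sensor range limits, the 6-axis sample layout, and the VIMU prediction are all hypothetical placeholders standing in for the real hardware and the trained ResNet-GRU model.

```python
# Hypothetical full-scale limits for a typical MEMS IMU (assumptions, not
# values from the paper).
ACC_RANGE_G = 16.0       # accelerometer full scale, in g
GYRO_RANGE_DPS = 2000.0  # gyroscope full scale, in deg/s

def is_overrange(sample, acc_limit=ACC_RANGE_G, gyro_limit=GYRO_RANGE_DPS):
    """Flag a 6-axis sample (ax, ay, az, gx, gy, gz) in which any axis
    has reached the sensor's measurement limit (i.e. is saturated)."""
    acc, gyro = sample[:3], sample[3:]
    return any(abs(a) >= acc_limit for a in acc) or \
           any(abs(g) >= gyro_limit for g in gyro)

def select_source(foot_sample, vimu_sample):
    """Use the physical foot-mounted IMU while it is within range;
    otherwise fall back to the VIMU sample predicted from the thigh IMU
    (here just passed in, standing in for the ResNet-GRU output)."""
    return vimu_sample if is_overrange(foot_sample) else foot_sample

# Usage: a saturated gyroscope axis triggers the VIMU fallback.
normal    = (0.1, -0.2, 1.0,   12.0, -5.0, 3.0)
saturated = (0.1, -0.2, 1.0, 2000.0, -5.0, 3.0)
vimu      = (0.1, -0.2, 0.98, 1500.0, -5.0, 3.0)
print(select_source(normal, vimu) is normal)     # physical IMU kept
print(select_source(saturated, vimu) is vimu)    # VIMU substituted
```

In the paper's pipeline, the gait classifier additionally selects which gait-specific ResNet-GRU model produces `vimu_sample`; the switch itself reduces to this kind of per-sample range check.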


