Abstract

When the wearer of a Wearable Human-Machine Interaction System (WHMIS) takes a strenuous walk, the Inertial Measurement Units (IMUs) mounted on the extremities are likely to exceed the measurement range specified in the sensors' design. This over-range leads to poor navigation performance of the system. To overcome this problem, this paper proposes a deep learning-based method to achieve robust autonomous navigation. Specifically, a Faster Region-Convolutional Neural Network (Faster R-CNN) is built to recognize the wearer's gait type. For each gait type, inertial data from the wearer's thigh and foot are collected at the same frequency as training samples and then used to construct a corresponding Gated Recurrent Unit-Support Vector Regression (GRU-SVR) hybrid deep neural network model for virtual IMU simulation. A virtual foot-mounted IMU is then constructed in real time from the physical thigh-mounted IMU based on the neural network models mentioned above. Once an over-range of the physical foot-mounted IMU is detected, the virtual foot inertial information is used to reconstruct the WHMIS navigation system. Experimental results show that the proposed navigation method is fault-tolerant under complex gaits, and that the 3D positioning performance of the reconstructed navigation system is approximately equivalent to that of the fault-free WHMIS navigation system.
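The switching logic described above (use the physical foot-mounted IMU until it saturates, then substitute the model-predicted virtual IMU) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the range limit `RANGE_G` and the placeholder `predict_foot_from_thigh` (which stands in for the trained GRU-SVR model) are assumptions introduced here.

```python
# Hypothetical sketch of the over-range fallback described in the abstract.
# RANGE_G and predict_foot_from_thigh are illustrative assumptions, not
# values or functions taken from the paper.

RANGE_G = 16.0  # assumed accelerometer design range, in g


def is_over_range(sample, limit=RANGE_G):
    """Flag a foot-IMU sample in which any axis saturates the sensor range."""
    return any(abs(axis) >= limit for axis in sample)


def predict_foot_from_thigh(thigh_window):
    """Stand-in for the gait-specific GRU-SVR virtual-IMU model: here a
    crude placeholder that scales the most recent thigh sample."""
    latest = thigh_window[-1]
    return tuple(1.5 * axis for axis in latest)


def select_foot_sample(foot_sample, thigh_window):
    """Feed the navigation filter the physical foot-IMU sample unless it
    is over-range, in which case substitute the virtual (predicted) one."""
    if is_over_range(foot_sample):
        return predict_foot_from_thigh(thigh_window)
    return foot_sample
```

In the actual system, the placeholder predictor would be replaced by the gait-type-specific GRU-SVR model selected by the Faster R-CNN gait classifier.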
