A deep-learning human gait recognition algorithm based on multi-source data fusion is proposed to help an exoskeleton realize complex human-exoskeleton cooperative motion. A lightweight gait acquisition device simultaneously acquires three types of sensor signals: thigh surface electromyography (sEMG), ground reaction force (GRF) at the feet, and motion information from two joints. The sample data are obtained from extensive multi-scene gait experiments covering standing still, walking up/down stairs, walking up/down slopes, etc. The proposed algorithm recognizes multiple gait phases (27 classes) across different motion patterns using short-time gait data and offers two key advantages: (1) built around multi-source sensors distributed over the human body, the wearable device is easy to use across subjects of different weights and heights, at low cost and light weight; (2) the critical features of the gait data are identified by a Bi-directional Long Short-Term Memory (BiLSTM) attention neural network, which achieves higher recognition accuracy than other state-of-the-art recognition methods, especially during arbitrary gait-switching periods.
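To make the architecture concrete, the following is a minimal sketch (not the authors' implementation) of a BiLSTM with a simple attention layer classifying short windows of fused sensor channels into 27 gait phases; the channel count, window length, hidden size, and use of PyTorch are assumptions for illustration only.

```python
# Illustrative sketch, not the paper's code: BiLSTM + attention over a short
# multi-sensor window (sEMG + GRF + joint angles), producing 27 class logits.
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    def __init__(self, n_channels=12, hidden=64, n_classes=27):
        super().__init__()
        # Bi-directional LSTM over the fused sensor sequence
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        # One attention score per time step
        self.attn = nn.Linear(2 * hidden, 1)
        # Classification head over the attention-weighted context vector
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                        # x: (batch, time, channels)
        h, _ = self.lstm(x)                      # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over time
        context = (w * h).sum(dim=1)             # weighted sum of hidden states
        return self.head(context)                # logits: (batch, n_classes)

# Example usage with assumed dimensions: 8 windows, 40 time samples, 12 channels
model = BiLSTMAttention()
window = torch.randn(8, 40, 12)
logits = model(window)                           # shape (8, 27)
```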