Abstract
Motion pattern classification, which infers a wearer's locomotion mode from data acquired by multiple sensors mounted on the exoskeleton, is an important research topic for lower extremity exoskeleton robots. We designed a wearable lower limb exoskeleton robot equipped with multiple sensors, mainly force sensitive resistors (FSRs) embedded in smart shoes and encoders in the joints. The robot was intended to help its wearer carry heavy loads while ascending and descending stairs. In the stair-walking experiments, subjects wore the exoskeleton and ascended and descended stairs for a designated time. Because each subject walked on flat ground before and after the stair walking, four transition motions arose between stair walking and level-ground walking; such transitions have received comparatively little attention in the classification literature. The aim of this paper is to classify these motion patterns with a learning algorithm. A convolutional neural network (CNN) and a gated recurrent unit (GRU) were combined to improve classification accuracy: the CNN extracts features of the motion pattern, while the GRU captures the temporal correlation across the gait. Experiments showed that the proposed CNN-GRU achieves significantly higher prediction accuracy in motion pattern classification: it reached 95.51%, whereas the CNN, GRU and LSTM-CNN baselines did not exceed 93.22%.
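To make the CNN-GRU pipeline concrete, the following is a minimal PyTorch sketch of the architecture the abstract describes: 1-D convolutions extract per-timestep features from the multi-channel sensor signals, and a GRU models temporal correlation across the gait window. All layer sizes, the channel count, the window length, and the class count (stair ascent/descent, level walking, and four transitions) are illustrative assumptions, not the authors' reported configuration.

```python
import torch
import torch.nn as nn

class CNNGRU(nn.Module):
    """Hypothetical CNN-GRU motion pattern classifier.

    The CNN front end extracts features from windows of FSR and
    joint-encoder signals; the GRU captures their temporal
    correlation. Hyperparameters here are assumptions for
    illustration only.
    """
    def __init__(self, n_channels=8, n_classes=7, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.gru = nn.GRU(input_size=64, hidden_size=hidden,
                          batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, channels, time) window of sensor readings
        feats = self.cnn(x)            # (batch, 64, time // 2)
        feats = feats.transpose(1, 2)  # (batch, time // 2, 64) for the GRU
        _, h = self.gru(feats)         # h: (1, batch, hidden), last state
        return self.fc(h.squeeze(0))   # class logits, (batch, n_classes)

# Example: classify a batch of 2 s windows sampled at an assumed 100 Hz
model = CNNGRU()
logits = model(torch.randn(16, 8, 200))  # -> (16, 7)
```

Taking only the GRU's final hidden state summarizes the whole window into one vector before classification; this is one common design choice and may differ from the paper's exact head.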