Abstract

Research on rehabilitation robots is gradually moving toward combining human intention recognition with control strategies to stimulate user involvement. To enhance the interactive performance between the robot and the human body, we propose a machine-learning-based human motion intention recognition algorithm that uses sensor information such as force, displacement, and wheel speed. The proposed system uses the bi-directional long short-term memory (BILSTM) algorithm to recognize actions such as falling, walking, and turning, achieving a recognition accuracy of 99.61%. In addition, a radial basis function neural network adaptive sliding mode controller (RBFNNASMC) is proposed to make the lower limb exoskeleton track the patient's intended gait, with an adaptive law adjusting the weights of the RBF network online. This yields a dynamic estimate of the human–robot interaction forces and external disturbances and supplies a suitable driving torque to the exoskeleton joint motors. The stability of the controller is proven using Lyapunov stability theory. Finally, the experimental results demonstrate that the BILSTM classifier recognizes intentions more accurately than conventional classifiers and that its real-time performance meets the demands of the control cycle. Meanwhile, the RBFNNASMC controller achieves better gait tracking than a PID controller.
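To make the recognition stage concrete, the following is a minimal PyTorch sketch of a bidirectional LSTM classifier over windows of (force, displacement, wheel-speed) samples. The layer sizes, window length, and class names are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical BILSTM intention classifier: hidden size, window
# length, and the three classes (fall / walk / turn) are assumptions
# for illustration only.
import torch
import torch.nn as nn

class BiLSTMIntentClassifier(nn.Module):
    def __init__(self, n_features=3, hidden=64, n_classes=3):
        super().__init__()
        # bidirectional=True doubles the LSTM output dimension
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):             # x: (batch, time, n_features)
        out, _ = self.lstm(x)         # out: (batch, time, 2*hidden)
        return self.head(out[:, -1])  # classify from the final step

# usage: a batch of 8 windows, 200 time steps, 3 sensor channels
model = BiLSTMIntentClassifier()
logits = model(torch.randn(8, 200, 3))   # -> (8, 3) class scores
```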
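Similarly, a rough single-joint sketch of the control idea is shown below: an RBF network estimates the lumped interaction force and disturbance, its weights are adapted online by a law driven by the sliding surface, and a switching term provides robustness. All gains, RBF centers, and the control structure here are assumed for illustration and do not reproduce the paper's derivation.

```python
# Hypothetical RBFNN adaptive sliding mode controller for one joint.
# The plant model, gains, and adaptive law are illustrative only.
import numpy as np

lam, eta, gamma = 5.0, 2.0, 50.0     # surface slope, switching gain, adaptation rate
centers = np.linspace(-1.0, 1.0, 7)  # RBF centers over the tracking error
width = 0.5
W = np.zeros(7)                      # RBF output weights, adapted online

def rbf(e):
    # Gaussian basis functions evaluated at the tracking error
    return np.exp(-((e - centers) ** 2) / (2 * width ** 2))

def control(q, dq, q_d, dq_d, ddq_d, dt):
    """One control step given joint state (q, dq) and desired gait
    trajectory (q_d, dq_d, ddq_d)."""
    global W
    e, de = q_d - q, dq_d - dq
    s = de + lam * e                 # sliding surface s = de + lam*e
    h = rbf(e)
    f_hat = W @ h                    # RBF estimate of disturbance term
    W += gamma * s * h * dt          # adaptive law: dW = gamma * s * h
    # feedforward + RBF compensation + smoothed switching term
    return ddq_d + lam * de + f_hat + eta * np.tanh(s / 0.05)
```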
