Abstract

This article develops a robot skill learning system with multi-space fusion, simultaneously considering motion/stiffness generation and trajectory tracking. To begin with, surface electromyography (sEMG) signals from the human arm are captured with the MYO armband to estimate endpoint stiffness. Gaussian Process Regression (GPR) is combined with the dynamic movement primitive (DMP) framework to extract richer skill features from multiple demonstrations. Then, the traditional DMP formulation is improved based on the Riemannian metric to encode the robot's orientation quaternions, which have non-Euclidean properties. Furthermore, an adaptive neural network (NN)-based finite-time admittance controller is designed to track the trajectory generated by the motion model and to reflect the learned stiffness characteristics. In this controller, a radial basis function neural network (RBFNN) is employed to compensate for the uncertainty of the robot dynamics. Finally, experimental validation is conducted on the ROKAE collaborative robot, confirming the effectiveness of the proposed approach. In summary, the presented framework is suitable for human-robot skill transfer tasks that require simultaneous consideration of position and stiffness in Euclidean space, as well as orientation on Riemannian manifolds.
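For context on the DMP formulation referenced above, the following is a minimal sketch of a classical one-dimensional discrete DMP learned from a single demonstration via locally weighted regression. It does not reproduce the paper's GPR-based multi-demonstration fusion, the Riemannian extension for quaternions, or the admittance controller; the class name, gains, and basis-function count are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D discrete DMP (classical formulation), learned from one demonstration.
# Illustrative sketch only; not the paper's improved Riemannian/GPR variant.

class DMP1D:
    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
        self.n_basis = n_basis
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, beta_z, alpha_x
        # Gaussian basis centers spread along the phase variable x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        d = np.diff(self.c)
        self.h = 1.0 / np.concatenate([d, d[-1:]]) ** 2  # basis widths
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstration (locally weighted regression)."""
        T = len(y_demo)
        self.tau = T * dt
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)  # canonical system
        # Target forcing term implied by the demonstration:
        # tau^2 * ydd = alpha_z * (beta_z * (g - y) - tau * yd) + f
        f_target = self.tau ** 2 * ydd - self.alpha_z * (
            self.beta_z * (self.g - y_demo) - self.tau * yd)
        s = x * (self.g - self.y0)
        for i in range(self.n_basis):
            psi_i = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(s * psi_i * f_target) / (np.sum(s * psi_i * s) + 1e-10)

    def rollout(self, dt, n_steps=None):
        """Integrate the transformation system to reproduce the learned motion."""
        n_steps = n_steps or int(self.tau / dt)
        y, z, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(n_steps):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (self.g - self.y0)
            zd = (self.alpha_z * (self.beta_z * (self.g - y) - z) + f) / self.tau
            yd = z / self.tau
            z += zd * dt
            y += yd * dt
            x += (-self.alpha_x * x / self.tau) * dt  # phase decay
            traj.append(y)
        return np.array(traj)

# Example usage with a synthetic demonstration:
dt = 0.01
t = np.arange(0.0, 1.0, dt)
demo = np.sin(0.5 * np.pi * t)   # smooth reach from 0 to 1
dmp = DMP1D()
dmp.fit(demo, dt)
reproduced = dmp.rollout(dt)
```

In the full framework described in the abstract, the single forcing term learned here would instead be regressed from multiple demonstrations (via GPR) and the transformation system reformulated on the quaternion manifold, but the canonical/transformation-system structure shown above is the common starting point.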
