Abstract

Recent advancements in Human-Robot Collaboration (HRC) have opened promising prospects for revolutionizing current manufacturing automation. Accurate modeling of human motion patterns is crucial for enabling a robot to understand human intention and predict future human motion from online observations. Because the widely used deterministic methods often lack confidence information about their results and thus cannot account for the variability of human motion, this work adopts a Probabilistic Dynamic Movement Primitives (PDMP)-based framework to recognize goal-directed human movements in the reaching phase and to make online predictions. The framework trains PDMPs on offline demonstrations of relevant hand movements. To avoid frame dependency, a novel DMP formulation with rotation and magnitude scaling is used, allowing learned motion patterns to generalize to similar tasks. The proposed framework has been validated in experiments on an object-transfer scenario at a workbench, using an Intel RealSense camera and the OpenPose system for motion capture. Results show that the framework offers good hand motion prediction performance in the presence of human motion variations and generalizes to related tasks beyond the limited set of demonstrated trajectories.
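For context, the standard discrete DMP formulation (Ijspeert et al.), on which PDMP builds, couples a critically damped spring-damper system with a learned forcing term. The abstract does not reproduce the paper's rotation- and magnitude-scaled variant, so the one-dimensional Python sketch below shows only the conventional form, whose forcing term is scaled by the goal distance (g - y0), the frame-dependent factor that the paper's formulation reportedly replaces. All gains, basis-function settings, and helper names (forcing, rollout) are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal one-dimensional sketch of a conventional discrete DMP
# (Ijspeert et al.); parameter values are illustrative assumptions.
alpha_z, beta_z = 25.0, 25.0 / 4.0   # critically damped spring-damper gains
alpha_x = 3.0                        # canonical-system decay rate
tau = 1.0                            # temporal scaling (movement duration)

# Gaussian basis functions spaced along the exponentially decaying phase x
centers = np.exp(-alpha_x * np.linspace(0.0, 1.0, 10))
widths = 1.0 / np.diff(centers, append=centers[-1] * 0.5) ** 2

def forcing(x, weights, y0, g):
    # Normalized weighted sum of basis activations, gated by the phase x
    # and scaled by the goal distance (g - y0); this last factor is what
    # the paper's frame-independent variant reportedly replaces with a
    # rotation and magnitude scaling.
    psi = np.exp(-widths * (x - centers) ** 2)
    return (psi @ weights) / (psi.sum() + 1e-10) * x * (g - y0)

def rollout(weights, y0, g, dt=0.01, steps=100):
    # Euler-integrate the transformation and canonical systems:
    #   tau * dz/dt = alpha_z * (beta_z * (g - y) - z) + f(x)
    #   tau * dy/dt = z
    #   tau * dx/dt = -alpha_x * x
    y, z, x = y0, 0.0, 1.0
    traj = []
    for _ in range(steps):
        f = forcing(x, weights, y0, g)
        z += dt / tau * (alpha_z * (beta_z * (g - y) - z) + f)
        y += dt / tau * z
        x += dt / tau * (-alpha_x * x)
        traj.append(y)
    return np.asarray(traj)

# With zero weights the forcing term vanishes and the system reduces to a
# point attractor that converges on the goal g.
trajectory = rollout(np.zeros(10), y0=0.0, g=1.0)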
