Abstract

Combining perception-based feedback control with dynamic movement primitive (DMP)-based motion generation for a robot's end-effector is a useful solution for many robotic manufacturing tasks. For instance, during an insertion task in which the hole or the recipient part is not visible to the eye-in-hand camera, a learning-based movement-primitive method can generate the end-effector path; once the recipient part enters the field of view (FOV), image-based visual servoing (IBVS) can take over control of the robot's motion. Inspired by such applications, this article presents a generalized control scheme that switches between DMP-based motion generation and IBVS control. To facilitate the design, a common state-space representation for the DMP and IBVS systems is first established. A stability analysis of the switched system using multiple Lyapunov functions shows that the state trajectories converge asymptotically to a bound. The developed method is validated in three real-world experiments using the eye-in-hand configuration of a Baxter research robot.
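The switching idea summarized in the abstract can be illustrated with a toy one-dimensional sketch. This is not the paper's formulation: the scalar state, the gains, and the FOV visibility test below are all illustrative placeholders, and the DMP is reduced to its point-attractor transformation system with the learned forcing term omitted.

```python
def dmp_step(x, v, goal, dt, alpha=25.0, beta=6.25):
    """One Euler step of a simplified DMP transformation system
    (critically damped point attractor; learned forcing term omitted)."""
    v = v + alpha * (beta * (goal - x) - v) * dt
    x = x + v * dt
    return x, v

def ibvs_step(feature, desired_feature, dt, gain=2.0):
    """One step of a proportional IBVS-style law: drive the
    image-feature error exponentially toward zero."""
    return feature - gain * (feature - desired_feature) * dt

def switched_controller(x0, goal, fov_threshold, dt=0.01, steps=2000):
    """Generate motion with the DMP until the target enters the
    (simulated) field of view, then switch to visual servoing."""
    x, v, mode = x0, 0.0, "DMP"
    for _ in range(steps):
        if mode == "DMP" and abs(goal - x) < fov_threshold:
            mode = "IBVS"  # target now 'visible': hand over to IBVS
        if mode == "DMP":
            x, v = dmp_step(x, v, goal, dt)
        else:
            x = ibvs_step(x, goal, dt)
    return x, mode
```

The one-way mode latch mirrors the scenario in the abstract: the DMP rollout brings the end-effector close enough that the recipient part becomes visible, after which IBVS alone regulates the residual error.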
