Abstract

Rather than programming joint or task trajectories explicitly, having a human physically manipulate the robot for direct adjustments is more intuitive, saves time, and increases usability, especially for nonexperts. However, interactive motion generation or repositioning of humanoid robots through direct human touch is challenging, especially for high-level, multijoint maneuvers. We propose a set of design rules for generating intuitive touch semantics, called the “two-touch kinematic chain paradigm.” Our method interprets user touch intentions to enable motions ranging from low-level single-joint control to high-level whole-body task control with posture generation, stepping, and walking. The goal is to provide the user with an intuitive protocol for physical humanoid manipulation that can serve a wide range of applications. The resulting set of touch semantics is embodied in a finite-state-machine-based framework with a task-space quadratic programming controller, which interprets human touch using capacitive sensors embedded in the humanoid’s shell and force-torque sensors located at the ankles and wrists. A position-controlled humanoid robot is used to assess the utility and function of the proposed touch semantics for physical manipulation. Furthermore, a user study with nonexperts examines how our approach is perceived in practice.