Abstract

Android robots used in arts performances must be able to perform human-like actions. Currently, the most common approach is to reconstruct the robot's motion by recording a choreographer's performance, directed by a performance director, in a studio equipped with 3D motion capture. In this paper, we propose a method for creating robot motion data from video-recorded human demonstrations. The proposed method has two objectives: 1) robot motions can be generated from existing performance video clips demonstrated by a human actor, and 2) partial corrections to the robot motion can be applied immediately when requested by the performance director. To achieve these objectives, we first adopt OpenPose to extract the coordinates of human joints from a video clip of a human performance. The extracted coordinates are then converted into joint angle values. Next, the joint angle values are adjusted to generate robot motions that satisfy the kinematics and hardware specifications of the robot. Finally, the efficacy of the proposed method is examined by applying the generated motions to a female android robot.
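To give a rough feel for the middle step of such a pipeline (converting extracted keypoint coordinates into joint angles and respecting hardware limits), the sketch below computes an elbow angle from one frame of 2D keypoints and clamps it to a joint limit. The keypoint indices, joint limits, and function names are assumptions chosen for illustration and do not reflect the paper's actual implementation.

```python
# Illustrative sketch only: converts 2D keypoints (as produced by a pose
# estimator such as OpenPose) into a single joint angle and clamps it to a
# hypothetical robot joint limit. Keypoint indices and limits are assumed
# values for illustration, not the paper's parameters.
import numpy as np

# COCO-style indices for the left arm (assumed layout).
L_SHOULDER, L_ELBOW, L_WRIST = 5, 6, 7

def elbow_angle(keypoints: np.ndarray) -> float:
    """Return the elbow flexion angle (radians) from one frame of 2D keypoints.

    keypoints: array of shape (num_joints, 2) holding (x, y) pixel coordinates.
    """
    upper = keypoints[L_SHOULDER] - keypoints[L_ELBOW]  # elbow -> shoulder
    fore = keypoints[L_WRIST] - keypoints[L_ELBOW]      # elbow -> wrist
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore) + 1e-8)
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def to_robot_command(angle: float, lo: float = 0.0, hi: float = 2.4) -> float:
    """Clamp the estimated angle to hypothetical hardware limits (radians)."""
    return float(np.clip(angle, lo, hi))

if __name__ == "__main__":
    # One fake frame of 18 keypoints; only the left-arm entries matter here.
    frame = np.zeros((18, 2))
    frame[L_SHOULDER] = [100.0, 100.0]
    frame[L_ELBOW] = [100.0, 150.0]
    frame[L_WRIST] = [140.0, 160.0]
    print(to_robot_command(elbow_angle(frame)))
```

In practice, per-frame angles obtained this way would also need temporal smoothing and velocity limiting before being sent to the robot, which corresponds to the paper's step of adjusting joint angles to satisfy the robot's kinematics and hardware specifications.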
