Abstract

This paper presents a novel approach to synthesizing whole-body motions from visual perception and reaction forces for a humanoid robot that maintains suitable physical interaction with its environment. A behavior comprising a whole-body motion, reaction forces, and visual perception is encoded into a probabilistic model referred to as a "motion symbol". The humanoid robot selects the motion symbol appropriate to the current situation and computes reference joint angles and reaction forces according to the selected symbol. The robot then modifies these references to satisfy a desired impedance relating the robot's whole-body positions and forces. This computation builds visual and physical feedback loops around the encoded knowledge of the behaviors, enabling a humanoid robot not only to perform human-like motions similar to the training behaviors, but also to physically adapt to its immediate environment. We apply the proposed framework only to controlling the upper-body motion of a humanoid robot. Experiments demonstrate that the proposed method allows a humanoid robot to control its upper-body motion in response to visual perception and reaction forces acting on its hands, achieving five tasks while controlling its lower-body motion to maintain balance.
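The abstract does not specify the exact form of the impedance modification step. As a minimal sketch, one standard realization is a discrete-time admittance filter that integrates the desired impedance dynamics to shift the position reference in proportion to the force error; the class name, the scalar formulation, and all parameter values below are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch, assuming a scalar discrete-time admittance filter as
# one plausible realization of the impedance-based reference modification.
# Parameter values (M, D, K, dt) are illustrative, not from the paper.

class AdmittanceFilter:
    """Shift a position reference so that the deviation e obeys the
    standard impedance law  M*e'' + D*e' + K*e = f_measured - f_ref."""

    def __init__(self, M: float = 2.0, D: float = 20.0,
                 K: float = 100.0, dt: float = 0.002):
        self.M, self.D, self.K, self.dt = M, D, K, dt
        self.e = 0.0      # position offset added to the reference
        self.e_dot = 0.0  # velocity of that offset

    def step(self, f_measured: float, f_ref: float) -> float:
        # Integrate the impedance dynamics over one control period,
        # so the robot yields when the measured force exceeds the
        # reference force supplied by the selected motion symbol.
        e_ddot = (f_measured - f_ref
                  - self.D * self.e_dot - self.K * self.e) / self.M
        self.e_dot += e_ddot * self.dt
        self.e += self.e_dot * self.dt
        return self.e

# Hypothetical usage inside a control loop:
#   filt = AdmittanceFilter()
#   x_cmd = x_ref_from_motion_symbol + filt.step(f_measured, f_ref_from_symbol)
```

In this reading, the motion symbol supplies the feedforward references (x_ref, f_ref) and the filter closes the physical feedback loop, which matches the division of roles described in the abstract.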
