We present a novel, simple experimental method, called physical human interactive guidance, for studying human-planned grasping. Rather than studying how a human grasps with his or her own biological hand or teleoperates a robot hand, the method has a human physically interact with a robot arm and hand, carefully moving and guiding the robot into the grasping pose while the robot's configuration is recorded. Analysis of the grasps obtained with this simple method produced two interesting results. First, the grasps produced by this method perform better than grasps generated by a state-of-the-art automated grasp planner. Second, when combined with a detailed statistical analysis using a variety of grasp measures (physics-based heuristics considered critical for a good grasp), the method offers insight into how human grasping is similar to and different from automated grasp-synthesis techniques. Specifically, data from the physical human interactive guidance method showed that human-planned grasps are similar to those from a state-of-the-art automated grasp planner but differ in one key aspect: in the human-planned grasps, the robot wrist was aligned with the object's principal axes (termed low skewness in this paper), whereas the automated grasps used arbitrary wrist orientations. Preliminary tests show that grasps with low skewness were significantly more robust than grasps with high skewness (77-93%). We conclude with a detailed discussion of how the physical human interactive guidance method relates to existing methods for extracting human principles of physical interaction.