Abstract

Teaching robots manipulation skills through human demonstration is an important research problem and a promising way to quickly program robots in future manufacturing settings. To understand a human demonstration, the manipulative actions it contains must be recognized. To improve recognition performance, we use three kinds of sensors to capture the motion and force involved in fine manipulative actions. In addition, recognition accuracy can be further improved by exploiting the correlation between actions and the objects they act on. In the proposed approach, important features for individual actions are selected first, and Hidden Markov Models (HMMs) are employed to characterize their temporal evolution. A Bayesian model is then adopted to capture the object/action dependency. The approach was evaluated in experiments on assembly tasks, and the results show that it recognizes manipulative actions effectively.
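
The abstract describes combining per-action HMM likelihoods with an object-conditioned action prior. The sketch below is not the authors' code; it is a minimal illustration of that idea, assuming one trained HMM per action supplies a log-likelihood for the observed motion/force sequence, with hypothetical action names, object names, and prior values.

```python
# Minimal sketch (illustrative only): fusing per-action HMM log-likelihoods
# with an object-conditioned action prior via Bayes' rule.
import numpy as np

ACTIONS = ["pick", "place", "screw", "insert"]   # hypothetical action set
OBJECTS = ["bolt", "plate"]                      # hypothetical object set

# p(action | object): assumed object/action dependency table.
ACTION_PRIOR = {
    "bolt":  np.array([0.30, 0.10, 0.50, 0.10]),
    "plate": np.array([0.40, 0.40, 0.05, 0.15]),
}

def recognize(hmm_logliks: np.ndarray, detected_object: str) -> str:
    """Return the most probable action for one observation window.

    hmm_logliks : log p(sequence | action), one entry per action, e.g.
                  produced by a separate HMM trained for each action on the
                  motion/force feature sequence.
    detected_object : the manipulated object, used to look up the prior.
    """
    log_prior = np.log(ACTION_PRIOR[detected_object])
    log_posterior = hmm_logliks + log_prior      # unnormalized log posterior
    return ACTIONS[int(np.argmax(log_posterior))]

if __name__ == "__main__":
    # Fabricated log-likelihoods from four per-action HMMs.
    logliks = np.array([-120.0, -118.5, -110.2, -125.4])
    print(recognize(logliks, "bolt"))            # -> "screw"
```

With a flat prior this reduces to picking the HMM with the highest likelihood; the object-conditioned prior is what lets the object context sway ambiguous cases, which is the dependency the abstract's Bayesian model is meant to exploit.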
