Abstract

We propose an unsupervised motion segmentation method that enables a robot to imitate and perform various unit motions by observing unsegmented human motion. Natural unsegmented human motion data contain various types of unit motions, such as 'waving good-bye', 'walking' and 'throwing a ball'. To imitate these motions, a robot has to segment the data and extract the unit motions from them. In previous work, an ergodic hidden Markov model (HMM) was used to model unsegmented human motion. However, there are two main problems with the classical use of this model. (i) Setting an appropriate number of hidden states is difficult because the complexity and the number of motions contained in the learning data are unknown. (ii) There was no effective chunking method that could chunk elemental motions into meaningful unit motions without being trapped in local minima. To overcome these problems, we developed an unsupervised motion segmentation method for imitation learning of unsegmented human motion that uses a sticky hierarchical Dirichlet process (HDP)-HMM, a nonparametric Bayesian model, together with an unsupervised chunking method based on a Gibbs sampler and the minimal description length (MDL) principle. We developed this chunking method to work with the sticky HDP-HMM and extract unit human motions. We conducted several experiments to evaluate this method. The proposed method could extract unit motions from unsegmented human motion data. The sticky HDP-HMM can model unsegmented human motion more accurately than a conventional HMM can, while simultaneously estimating the number of hidden states. We also evaluated the dependency of the HDP-HMM on the hyperparameters of the model.
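The MDL-based chunking idea can be illustrated with a toy two-part description-length computation (a hypothetical sketch under simplifying assumptions, not the paper's exact formulation): elemental-motion labels inferred by the HMM are grouped into candidate chunks, and the segmentation with the smaller total description length, dictionary cost plus encoding cost, is preferred. Recurring unit motions then emerge as chunks because reusing a dictionary entry is cheaper than re-encoding its symbols.

```python
import math
from collections import Counter

def description_length(chunks):
    """Two-part MDL score for a candidate segmentation (simplified sketch).

    chunks: a sequence of strings, each string being one candidate chunk
    of elemental-motion labels. The score is the cost of storing the
    chunk dictionary plus an entropy-style code length for the chunk stream.
    """
    # Dictionary cost: total number of symbols stored across distinct chunks.
    dict_cost = sum(len(c) for c in set(chunks))
    # Sequence cost: -sum(count * log2(probability)) over chunk types.
    counts = Counter(chunks)
    n = len(chunks)
    seq_cost = -sum(cnt * math.log2(cnt / n) for cnt in counts.values())
    return dict_cost + seq_cost

# Two segmentations of the same elemental-motion label sequence.
labels = "abcabcxyabcxy"
seg_fine   = tuple(labels)                      # every symbol is its own chunk
seg_chunky = ("abc", "abc", "xy", "abc", "xy")  # recurring units chunked

print(description_length(seg_chunky) < description_length(seg_fine))  # → True
```

Here the chunked segmentation wins because the repeated units 'abc' and 'xy' are paid for once in the dictionary; the actual method searches over segmentations with a Gibbs sampler rather than comparing two fixed candidates.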
