Abstract

This paper addresses the problems of motor skill learning, representation, and generalization in robot imitation learning. To this end, we present an Adapted Curvilinear Gaussian Mixture Model (AdC-GMM), a general extension of the GMM. The proposed model encodes data more compactly and, more importantly, is inherently suited to representing data with strong non-linearity. To infer the parameters of this model, a Cross-Entropy Optimization (CEO) algorithm is proposed, in which the cross-entropy loss on the training data is minimized. Unlike the traditional Expectation-Maximization (EM) algorithm, the CEO can automatically infer the optimal number of components. Generalized trajectories are then retrieved by an Adapted Curvilinear Gaussian Mixture Regression (AdC-GMR) model, and the task-parameterization (TP) technique is introduced to encode observations from different frames. All of the proposed algorithms are verified on comprehensive tasks: the CEO is evaluated on a handwriting task, a goal-directed reaching task is used to evaluate the AdC-GMM and AdC-GMR, and a novel hammer-over-a-nail task is designed to verify the task-parameterization technique. Experimental results demonstrate that the proposed CEO is superior to EM in terms of encoding accuracy and that the AdC-GMM achieves a more compact representation, reducing the number of components by up to 50%. In addition, the trajectory retrieved by the AdC-GMR is smoother, and its approximation error is comparable to that of Gaussian process regression (GPR) even though far fewer parameters need to be estimated; as a result, the AdC-GMR is much faster than GPR. Finally, simulation experiments on the hammer-over-a-nail task demonstrate that the proposed methods can be deployed in real-world applications.
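For orientation, the following minimal Python sketch illustrates the standard GMM-encoding plus GMR-retrieval pipeline that the AdC-GMM and AdC-GMR extend. It is not the paper's method: scikit-learn's EM-based GaussianMixture stands in for the CEO-fitted curvilinear model, and the function names, component count, and synthetic demonstration data are assumptions made for illustration only.

# A minimal sketch of the baseline GMM + GMR pipeline (not the AdC-GMM/AdC-GMR itself).
# The joint distribution over (time, position) is encoded with a GMM, and a smooth
# generalized trajectory is retrieved by conditioning each component on time.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(t, X, n_components=5, seed=0):
    """Fit a GMM on the joint data [t, x]; t has shape (N,), X has shape (N, D)."""
    data = np.column_stack([t, X])
    return GaussianMixture(n_components=n_components, random_state=seed).fit(data)

def gmr(gmm, t_query):
    """Gaussian mixture regression: condition the joint GMM on time t."""
    D_out = gmm.means_.shape[1] - 1
    X_hat = np.zeros((len(t_query), D_out))
    for i, t in enumerate(t_query):
        # responsibility of each component for this time step
        h = np.array([w * norm.pdf(t, m[0], np.sqrt(C[0, 0]))
                      for w, m, C in zip(gmm.weights_, gmm.means_, gmm.covariances_)])
        h /= h.sum()
        # responsibility-weighted sum of the per-component conditional means
        for k, (m, C) in enumerate(zip(gmm.means_, gmm.covariances_)):
            X_hat[i] += h[k] * (m[1:] + C[1:, 0] / C[0, 0] * (t - m[0]))
    return X_hat

# Usage with synthetic placeholder data: a noisy 2-D curved reaching motion.
t = np.linspace(0, 1, 200)
X = np.column_stack([np.sin(np.pi * t), t ** 2]) + 0.01 * np.random.randn(200, 2)
model = fit_joint_gmm(t, X, n_components=5)
traj = gmr(model, np.linspace(0, 1, 100))  # retrieved generalized trajectory

The paper's contributions replace the two main choices made here: the Gaussian components are adapted to curvilinear shapes, and the EM fitting step is replaced by cross-entropy optimization, which also selects the number of components automatically.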
