Abstract

In this paper, we propose an active learning approach applied to a music performance imitation scenario. The humanoid robot iCub listens to a human performance and then incrementally learns to use a virtual musical instrument in order to imitate the given sequence. This is achieved by first learning a model of the instrument, needed to locate where the required sounds are produced on a virtual keyboard laid out on a tactile interface. The robot then learns a model of its body capabilities, which serves to estimate the likelihood of success of the actions needed to imitate the sequence of sounds and to correct the errors made by the underlying kinematic controller. It also uses self-evaluation stages to provide feedback to the human instructor, who can use it to guide the robot's learning process.
