Abstract

This paper presents an inertial measurement unit (IMU)-based human gesture recognition system that enables a robot instrument player to understand the instructions dictated by an orchestra conductor and adapt its musical performance accordingly. It extends our previous publications on natural human–robot musical interaction. With this system, the robot can follow the real-time variations in musical parameters dictated by the conductor's movements, adding expression to its performance while remaining synchronized with all the other human partner musicians. The enhanced interaction ability not only improves the overall live performance, but also allows the partner musicians, as well as the conductor, to better appreciate the joint musical performance, thanks to the complete naturalness of the interaction.
