Abstract

We describe an interface designed and developed to allow an automatic synthesis system to be controlled by a human conductor. The aim is to endow electronic music with the expressiveness and feeling that are characteristic of live performances. The system recognises the movements and gestures made by a conductor during a live performance which conform to international conducting conventions. The system consists of a special conductor's baton and a personal computer with an image acquisition board connected to a camera. The movements of a small lamp placed on the tip of the baton are detected by the camera, digitised as light point position changes and stored. A best-fitting interpolation method is used to reconstruct the corresponding path, increase time resolution and predict the beat points. The data acquired by the system, i.e., beat point prediction, order number of the beats, amplitude of gesture, end of movement and absence of light points, are applied as input parameters of an algorithmic compositional environment. The system can be used to synchronise human and machine performances, to learn conducting movements, and to control algorithmic compositions in real time.
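To illustrate the path-interpolation and beat-prediction step described above, the following is a minimal sketch, not the paper's implementation: it assumes the tracked lamp positions arrive as (time, x, y) samples, uses a cubic spline as a stand-in for the best-fitting interpolation method, and takes local minima of the vertical coordinate as beat points; the function name, sample rate and beat criterion are all illustrative assumptions.

```python
# Minimal sketch of the beat-prediction idea (assumptions, not the paper's code):
# frames are (t, x, y) samples of the baton lamp, a cubic spline stands in for
# the "best-fitting interpolation method", and beats are marked at local minima
# of the vertical coordinate (the reversal of the downstroke).
import numpy as np
from scipy.interpolate import CubicSpline

def predict_beats(times, xs, ys, upsample=10):
    """Interpolate the lamp path, raise the time resolution, and return
    estimated beat times (local minima of the vertical coordinate)."""
    times, xs, ys = map(np.asarray, (times, xs, ys))
    spline_y = CubicSpline(times, ys)

    # Resample on a finer grid to increase time resolution.
    fine_t = np.linspace(times[0], times[-1], upsample * len(times))
    fine_y = spline_y(fine_t)

    # A beat is marked where the baton reverses its downward motion.
    minima = (fine_y[1:-1] < fine_y[:-2]) & (fine_y[1:-1] < fine_y[2:])
    return fine_t[1:-1][minima]

# Example: a roughly sinusoidal up-down motion sampled at 25 frames per second.
t = np.arange(0.0, 2.0, 0.04)
y = 100 + 40 * np.cos(2 * np.pi * t)   # one "beat" per second
x = np.zeros_like(t)
print(predict_beats(t, x, y))          # beat times near 0.5 s and 1.5 s
```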
