Abstract

This paper presents Music via Motion (MvM), a framework for transdomain mapping between the physical movements of performers and multimedia events, translating activity from one creative domain to another, for example from physical gesture to audio output. After a brief background on this domain and its prototype designs, the paper describes a number of inter- and multidisciplinary collaborative works for interactive multimedia performance. These include a virtual musical instrument interface that explores video-based tracking technology to provide an intuitive and nonintrusive musical interface, and sensor-based augmented instrument designs. The paper also describes a distributed multimedia-mapping server that allows multiplatform and multisensory integration, and presents a sample application that integrates a real-time face-tracking system. Ongoing developments and possible future explorations, including stage augmentation with virtual and augmented reality and gesture analysis of the correlations between musical and physical gesture, are also discussed.
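
To illustrate the kind of transdomain mapping the abstract describes, the following minimal Python sketch maps video-frame motion intensity to MIDI-style note parameters. This is not the authors' MvM implementation: the frame-differencing measure, the mapping ranges, and all names here are illustrative assumptions.

```python
# Illustrative sketch only (not the MvM system described in the paper):
# a minimal gesture-to-audio transdomain mapping in which motion intensity
# between consecutive video frames drives MIDI-style pitch and velocity.
import numpy as np

def motion_intensity(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    """Mean absolute pixel difference between consecutive 8-bit grayscale
    frames, normalized to [0, 1]."""
    diff = np.abs(frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean() / 255.0)

def map_to_note(intensity: float) -> tuple[int, int]:
    """Map motion intensity to a (pitch, velocity) pair; the ranges
    (C3..C6, velocity 32..127) are arbitrary choices for illustration."""
    pitch = 48 + int(intensity * 36)       # higher pitch with more motion
    velocity = min(32 + int(intensity * 95), 127)  # louder with more motion
    return pitch, velocity

# Synthetic frames stand in for live camera input in this example.
rng = np.random.default_rng(0)
prev = rng.integers(0, 256, (120, 160), dtype=np.uint8)
curr = rng.integers(0, 256, (120, 160), dtype=np.uint8)
print(map_to_note(motion_intensity(prev, curr)))
```

In a real interactive setting, the per-frame note parameters would be sent to a synthesizer rather than printed, and a tracking stage would typically isolate the performer before measuring motion.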
