Abstract

From John Cage’s prepared piano to the turntable, the history of musical instruments is scattered with examples of musicians who deeply customised their instruments to fit personal artistic objectives, objectives that differed from the ones the instruments had been designed for. Their digital counterparts, however, are often presented as closed, finalised systems with a priori symbolic rules set by their designers, leaving very little room for artists to customise the technology for their unique art practices; in these cases the only way to change the mode of interaction with a digital instrument is to reprogram it, a possibility available to programmers but not to musicians. This thesis presents two digital musical instruments designed with the explicit goal of being highly customisable by musicians and of providing different modes of interaction, whilst retaining simplicity and immediacy of use. The first leverages real-time gesture recognition to provide continuous feedback that guides users in defining the behaviour of the system and the gestures it recognises. The second is a novel tangible user interface that allows musicians to transform everyday objects into expressive digital musical instruments, and whose generated sound depends strongly on the particular nature of the physical object selected.
