Abstract

Computer-based interactive music generation systems have an advantage over acoustic instruments with respect to the flexibility of the human interface for their control. Whereas human performers must conform to the relatively invariant constraints of acoustic instruments, a computer interface can be rapidly updated to reflect changes in user preferences and performance requirements. This paper describes a software environment that executes miniature psychophysical scaling experiments for a single user of an interactive music generation system, deriving individualized control over music synthesis by inverting the obtained psychophysical scaling functions. A case study is presented in which the system was used to control parametric synthesis of musical timbres from an electronic musical keyboard, providing automatic perceptual mapping of synthesis parameters within their musically useful range. More in-depth exploration of the timbral similarities among a user-selected set of synthesis patches yielded low-dimensional control structures that could be used to organize musical timbres in preparation for composition or performance.
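The core idea — fitting a perceptual scaling function to a user's judgments and then inverting it so that equal steps in perceived magnitude map to the required synthesis-parameter values — can be sketched as follows. This is an illustrative assumption, not the paper's implementation: it models the scaling result with Stevens' power law (ψ = k·φ^a) fitted by least squares in log–log space; all data, function names, and values are hypothetical.

```python
# Hypothetical sketch of perceptual parameter mapping via an inverted
# psychophysical scaling function (Stevens' power law: psi = k * phi**a).
# Data and names are illustrative only, not taken from the paper.
import math

def fit_power_law(params, ratings):
    """Least-squares fit of log(rating) = log(k) + a * log(param)."""
    xs = [math.log(p) for p in params]
    ys = [math.log(r) for r in ratings]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - a * mx)
    return k, a

def invert(k, a, psi):
    """Synthesis-parameter value predicted to evoke perceived magnitude psi."""
    return (psi / k) ** (1.0 / a)

# Toy scaling-experiment data: parameter settings and one listener's
# magnitude estimates for the resulting sounds.
params = [0.1, 0.2, 0.4, 0.8]
ratings = [1.0, 1.9, 4.1, 7.8]
k, a = fit_power_law(params, ratings)

# Perceptually even control: equal steps in perceived magnitude generally
# correspond to unequal steps in the raw synthesis parameter.
perceptual_steps = [invert(k, a, psi) for psi in (1.0, 2.0, 4.0, 8.0)]
```

In a keyboard-control setting, the inverted function would sit between the controller value (e.g. key velocity or a fader position) and the synthesis parameter, so that the interface feels perceptually linear to the individual user.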
