Abstract

This article presents ISEE, an intuitive sound editing environment, as a general sound synthesis model based on expert auditory perception and cognition of musical instruments. It discusses the background of current synthesizer user interface design and related timbre space research. Of the three principal parameters of sound—pitch, loudness and timbre—ISEE focuses on control of timbre, and affects only the range of pitch and loudness. Timbre is manipulated using four abstract timbre parameters: overtones, brightness, articulation and envelope. These abstract timbre parameters are implemented in different ways for different instruments. They define instrument spaces, which can be organized into a hierarchy to structure the refinement of timbre parameter behavior. An Apple Macintosh implementation of ISEE is described. ISEE has four main advantages over traditional sound synthesis editors. Firstly, it allows musicians to control sound synthesis as they control their musical instrument: by continuous movement, reducing cognitive control load. Secondly, it uses timbre parameters identified by human experience instead of indirect and intricate synthesis model parameters. Thirdly, it integrates a librarian system in the sound synthesis model. Finally, it enables transparent use of several synthesis models at a time.
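To illustrate the idea of abstract timbre parameters being "implemented in different ways for different instruments," the following sketch shows how one set of perceptual controls might be translated into the parameters of two different synthesis models. All names and mapping formulas here are hypothetical, chosen for illustration; they are not taken from ISEE itself.

```python
from dataclasses import dataclass

@dataclass
class TimbreParams:
    """The four abstract timbre parameters, each normalized to 0.0-1.0.
    The normalization and field names are assumptions for this sketch."""
    overtones: float     # harmonic richness
    brightness: float    # spectral emphasis toward high frequencies
    articulation: float  # sharpness of the attack
    envelope: float      # overall length of the amplitude contour

def map_to_fm(p: TimbreParams) -> dict:
    """Hypothetical mapping onto FM synthesis parameters."""
    return {
        "mod_index": p.overtones * 10.0,            # richer spectrum -> deeper modulation
        "mod_ratio": 1.0 + round(p.brightness * 4), # brighter -> higher carrier:modulator ratio
        "attack_s": 0.5 * (1.0 - p.articulation),   # sharper articulation -> shorter attack
        "release_s": 2.0 * p.envelope,
    }

def map_to_subtractive(p: TimbreParams) -> dict:
    """The same abstract parameters mapped onto a subtractive model."""
    return {
        "osc_wave": "saw" if p.overtones > 0.5 else "triangle",
        "cutoff_hz": 200.0 + p.brightness * 8000.0,  # brighter -> higher filter cutoff
        "attack_s": 0.5 * (1.0 - p.articulation),
        "release_s": 2.0 * p.envelope,
    }

# A single gesture in the abstract timbre space drives both models at once,
# which is the sense in which several synthesis models can be used transparently.
p = TimbreParams(overtones=0.8, brightness=0.6, articulation=0.9, envelope=0.3)
print(map_to_fm(p))
print(map_to_subtractive(p))
```

Because the musician only ever touches the four abstract parameters, the underlying synthesis model can be swapped or combined without changing the interface, consistent with the abstract's final claim.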
