Abstract

Musical activity is multifarious. From instrument making to instrument playing and compositional conception, the machines involved and the human/machine relations are quite different. Nevertheless, it is only in instrument playing, and more generally in instrumental experimentation, that the human/machine relation corresponds to and requires a true real-time situation as defined in the computing context. Instrumental experience is therefore a fundamental reference for conceiving the basic functions of real-time digital synthesis systems. Accordingly, algorithmic models act as intermediaries between the instrumentalist's gesture and the musical sound. Their functions are then: (1) to permit and guide the gestural action, (2) to extract the pertinent information from it, and (3) to create an acoustical signal by combining this temporal information with atemporal, or structural, information. The latter may be considered the instrument's definition or representation. By its mechanical nature, the gesture calls, at least as a first step, for a mechanical model. We present a digital sound synthesis system, the CORDIS system, founded entirely on the mechanical modeling of musical instruments, which are analyzed and reconstructed from elementary mechanical components. One of our goals is to provide a way of understanding certain elementary components of musical languages by correlating them with concrete elementary experiments on acoustical source objects.
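The idea of reconstructing an instrument from elementary mechanical components can be illustrated with a minimal sketch, not the actual CORDIS implementation: a chain of point masses connected by visco-elastic links (spring plus damper), excited by an initial displacement and integrated with a semi-implicit Euler scheme. All names and parameter values below are illustrative assumptions.

```python
def simulate_string(n_masses=8, steps=1000, m=1.0, k=0.1, z=0.001, dt=1.0):
    """Toy mass-spring-damper chain; returns the displacement of the
    middle mass at each step (a crude 'plucked string' signal)."""
    x = [0.0] * n_masses  # displacements
    v = [0.0] * n_masses  # velocities
    x[n_masses // 2] = 1.0  # initial 'pluck' at the center
    out = []
    for _ in range(steps):
        f = [0.0] * n_masses
        for i in range(n_masses - 1):
            # visco-elastic link between neighboring masses
            fe = k * (x[i + 1] - x[i]) + z * (v[i + 1] - v[i])
            f[i] += fe
            f[i + 1] -= fe
        # fixed ends: pull the boundary masses back toward rest
        f[0] += -k * x[0]
        f[-1] += -k * x[-1]
        for i in range(n_masses):
            v[i] += dt * f[i] / m  # update velocity first (semi-implicit Euler)
            x[i] += dt * v[i]      # then position, using the new velocity
        out.append(x[n_masses // 2])
    return out

signal = simulate_string()
```

The semi-implicit (symplectic) update keeps the undamped oscillation bounded, and the small damping term makes the signal decay slowly, which is the qualitative behavior one expects from a struck or plucked mechanical object.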
