Abstract

An integrated computer graphics and audio signal processing system is developed for producing real-time audiovisual performances, such as a virtual concert. Animated musician models play MIDI-coded music, with hands and fingers synchronized to each note by kinematic calculations. Instrument sounds are synthesized by physical modeling techniques. The acoustic response of the concert hall is calculated from its geometric model and material information. Ray tracing and image source methods are used for early reflections, whereas a recursive filter structure, consisting of comb and allpass filters, models the later diffuse reverberation. For a three-dimensional sensation of the environment, the direct sound is auralized with an FIR approximation of the listener's head-related transfer function (HRTF), and the image sources are processed with interaural amplitude and time differences (IAD and ITD). The system implementation is distributed over a high-speed network using workstations for the visual and interactive parts and a dedicated digital signal processor for sound synthesis and HRTF convolution. The listener may move freely in the virtual concert hall using a mouse controller. Methods for conducting the virtual orchestra with a tracker baton are under development. [Work supported by the Academy of Finland.]
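
The abstract does not specify which physical model is used for each instrument. As an illustration of the general approach, the following minimal Karplus-Strong plucked-string sketch in Python shows the basic idea of a delay line with lowpass feedback standing in for a vibrating string; the pitch, damping, and sample-rate values are assumed, not taken from the system described.

```python
import numpy as np

def karplus_strong(freq, duration, sr=44100):
    """Minimal Karplus-Strong plucked-string model: a delay line
    excited with a noise burst, damped by a two-point averaging
    lowpass filter in the feedback loop. All parameters assumed."""
    n = int(sr / freq)                  # delay-line length sets the pitch
    buf = np.random.uniform(-1, 1, n)   # noise burst models the pluck
    out = np.empty(int(sr * duration))
    for i in range(len(out)):
        out[i] = buf[i % n]
        # lowpass feedback: average of two adjacent samples, slightly damped
        buf[i % n] = 0.996 * 0.5 * (buf[i % n] + buf[(i + 1) % n])
    return out

tone = karplus_strong(220.0, 2.0)       # 2 s of a 220 Hz "string"
```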
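The recursive comb-and-allpass structure mentioned for the late diffuse reverberation corresponds to the classic Schroeder reverberator topology. The sketch below shows that topology; the delay lengths, gains, and wet/dry mix are illustrative values, not those of the system.

```python
import numpy as np

def comb(x, delay, g):
    """Feedback comb filter: y[n] = x[n] + g * y[n - delay]."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        y[n] = x[n] + (g * y[n - delay] if n >= delay else 0.0)
    return y

def allpass(x, delay, g):
    """Schroeder allpass: y[n] = -g*x[n] + x[n-delay] + g*y[n-delay]."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        xd = x[n - delay] if n >= delay else 0.0
        yd = y[n - delay] if n >= delay else 0.0
        y[n] = -g * x[n] + xd + g * yd
    return y

def schroeder_reverb(x):
    # Parallel combs with mutually prime delays smear the echoes;
    # series allpasses raise echo density without colouring the tail.
    wet = sum(comb(x, d, 0.84) for d in (1422, 1491, 1557, 1617))
    for d in (225, 556):
        wet = allpass(wet, d, 0.7)
    return x + 0.03 * wet               # dry signal plus attenuated tail
```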
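Processing an image source with interaural time and amplitude differences can be sketched as below. The Woodworth spherical-head ITD formula and the cosine level-difference curve are common textbook approximations chosen here for illustration; the abstract does not say which ITD/IAD models the authors used.

```python
import numpy as np

def binaural_reflection(x, azimuth_deg, gain, sr=44100):
    """Pan one image-source reflection with interaural time and
    amplitude differences. Head radius, ITD formula, and level
    curve are assumptions for this sketch."""
    az = np.radians(azimuth_deg)
    head_r = 0.0875                           # head radius in metres (assumed)
    itd = head_r / 343.0 * (az + np.sin(az))  # Woodworth ITD approximation
    lag = int(round(abs(itd) * sr))           # delay applied to the far ear
    near = gain * x
    far = gain * (0.5 + 0.5 * np.cos(az)) * x # crude IAD curve (assumed)
    delayed = np.concatenate([np.zeros(lag), far])[: len(x)]
    if azimuth_deg >= 0:                      # source on the right: left ear is far
        return np.stack([delayed, near])      # (left, right)
    return np.stack([near, delayed])
```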
