Abstract

The design of real-time digital musical instruments, whether novel instruments or computer-based emulations of traditional musical performance practices, has always required nearly imperceptible latency between controlling gesture and expressed sound. Sound- and music-based experiences built within immersive, visually rendered spaces often struggle to provide seamless, articulate control of dynamic rendered output. Working across and within extended reality environments, the complexity of the challenge continues to evolve, with current generations of hardware and software pushing central and graphics processors to their respective limits. The introduction of networked performance spaces has only added to this complexity, creating a need for additional latency mitigation strategies drawn from both the technical and artistic domains. This paper discusses recent creative and technical strategies and concerns in the design, development, and use of real-time virtual musical instruments.
