The auralization of dynamic soundscapes poses significant computational challenges, as multiple, often moving, sources have to be considered simultaneously. Making such auralizations interactive rules out pre-rendering, so the underlying computations must instead be performed in real time. The “Virtual Acoustics” (VA) framework enables such real-time applications by leveraging geometrical acoustics and efficient multi-threading. Delivering these simulations to a larger audience, however, has proven difficult, as they still require a lengthy setup, and the achievable simulation complexity depends heavily on the available hardware. When high-fidelity 3D visualization is also included, hardware requirements rise further, hindering remote deployment. We present a web-based approach that executes the simulations on a server and streams the results in real time, offloading the heavy calculations to a sufficiently powerful machine. 3D graphics are rendered by Unreal Engine and streamed using Pixel Streaming, while the VA auralization is transmitted via WebRTC audio streams. Visualization and auralization are ultimately aggregated in a single website for platform-independent access. User input is gathered by the website and relayed back to the server, making the experience interactive. This approach can thus deliver high-fidelity virtual environments with minimal setup and hardware requirements on the user's side. We demonstrate the approach by streaming an interactive city park soundscape.