We present an integrated digital twin for room-oriented immersive systems (ROIS) that facilitates spatially distributed multi-user remote telepresence. Building on prior development of game-based virtual co-location systems in immersive rooms, this work addresses synchronous communication and spatially distributed interaction between remote participants through a distributed audio spatialization system. The digital twin allows users in physically disjoint locations to navigate shared virtual worlds, with virtual sound-source information transmitted through remote procedure calls (RPC). Precise user location tracking data is broadcast directly through in-game sessions. For each virtual sound source, generated procedurally or through non-invasive microphone inputs, real-time reverberation is synthesized and encoded based on proximity-modulated perceptual parameters. These virtual sound sources are rendered in each ROIS facility through Open Sound Control (OSC). This approach avoids the computational expense of physically based acoustic rendering techniques, which are ill-suited to networked applications. We evaluate the system using latency metrics informed by round-trip-time measurements. Taken as a whole, the integrated digital twin also considers broader scalability so that researchers and content creators can deploy it across comparable immersive rooms in the future. [Work supported by NSF IIS-1909229 & CNS-1229391.]
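As a rough illustration of the OSC rendering path described above, the sketch below sends one virtual sound source's position and a proximity-modulated reverb parameter to a local ROIS renderer using the python-osc library. The address patterns, port, and the distance-to-reverb mapping are illustrative assumptions only, not the system's actual protocol or parameterization.

```python
# Minimal sketch: per-source OSC updates with a proximity-modulated reverb send.
# Assumptions (not from the abstract): address patterns, renderer endpoint,
# and the linear distance-to-reverb mapping are hypothetical placeholders.
import math
from pythonosc.udp_client import SimpleUDPClient

OSC_HOST, OSC_PORT = "192.168.1.50", 9000  # hypothetical ROIS audio renderer endpoint
client = SimpleUDPClient(OSC_HOST, OSC_PORT)

def send_source_update(source_id, src_pos, listener_pos):
    """Send one virtual sound source's position and a proximity-modulated
    reverb send level to the local renderer over OSC."""
    dx, dy, dz = (s - l for s, l in zip(src_pos, listener_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Example perceptual mapping: reverb send grows with distance, clamped to [0, 1].
    reverb_send = min(1.0, distance / 10.0)
    client.send_message(f"/source/{source_id}/xyz", list(src_pos))
    client.send_message(f"/source/{source_id}/reverb", reverb_send)

# Example: a procedurally generated source 3 m in front of a tracked listener.
send_source_update(1, (0.0, 0.0, 3.0), (0.0, 0.0, 0.0))
```

Sending only lightweight perceptual parameters over OSC, rather than full acoustic simulation state, is what keeps the per-facility rendering load low in this kind of networked setup.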