This project investigated the correlation between virtual reality (VR) imagery and ambisonic sound. With the growing popularity of VR applications, understanding how sound is perceived in virtual environments is crucial for enhancing immersion. In the experiment, participants were immersed in a virtual environment replicating a concert hall. Their task was to assess how well sound scenes, which differed in reverberation time and character, matched the invariant visual scene. The research used paired-comparison tests: for each pair, participants identified the sound scene they considered the closer match to the concert hall seen through the VR goggles. Each sound scene used a different impulse response, all of which were recorded in real venues such as concert halls, auditoriums, and churches. To provide a realistic auditory experience, the sound scenes were rendered in third-order ambisonics and decoded binaurally using head-related transfer functions (HRTFs). The virtual concert hall was generated in Unreal Engine and was identical across all tests. A major conclusion of the research was confirmation of the role spatial sound plays in creating immersive VR experiences: the study demonstrated that appropriately matching spatial sound to the VR visual scene is essential for achieving complete immersion. In addition, participants' expectations and preferences regarding reverberation characteristics in different types of spaces were identified. These findings have significant implications for the design of virtual environments; understanding these aspects can contribute to improving VR technology and creating more immersive and realistic virtual experiences for users.
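The rendering step described above — a third-order ambisonic scene decoded to binaural stereo via HRTF-derived filters — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the assumed 16-channel layout, and the per-ear filter format are assumptions for the sake of the example.

```python
import numpy as np

def binaural_decode(ambi_signal, filters_left, filters_right):
    """Decode a third-order ambisonic signal to binaural stereo.

    ambi_signal   : (16, n_samples) array — (3+1)^2 = 16 ambisonic channels.
    filters_left  : (16, filter_len) array — per-channel left-ear filters,
                    e.g. derived from an HRTF set (hypothetical format).
    filters_right : (16, filter_len) array — likewise for the right ear.

    Each ambisonic channel is convolved with its per-ear filter and the
    results are summed, yielding one signal per ear.
    """
    n_ch, _ = ambi_signal.shape
    assert n_ch == 16, "third-order ambisonics has 16 channels"
    left = sum(np.convolve(ambi_signal[c], filters_left[c]) for c in range(n_ch))
    right = sum(np.convolve(ambi_signal[c], filters_right[c]) for c in range(n_ch))
    return np.stack([left, right])  # shape (2, n_samples + filter_len - 1)
```

In practice the per-scene room impulse responses mentioned in the abstract would be applied before this stage, and the filters would come from a measured HRTF dataset; the sketch only shows the channel-wise convolve-and-sum structure of a static binaural decode.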