Abstract

Audio-only augmented reality consists of enhancing a real environment with virtual sound events. Integrating these virtual events seamlessly into the environment requires processing them with artificial spatialization and reverberation effects that simulate the acoustic properties of the room. In augmented reality, however, the listener's visual and acoustic environment may not be fully controlled. This study aims to gain insight into the acoustic cues (intensity and reverberation) that listeners use to form an auditory distance judgment, and to observe whether these strategies can be influenced by the listener's environment. To do so, we present a perceptual evaluation of two distance-rendering models informed by a measured Spatial Room Impulse Response. The rendering methods were chosen to design stimulus categories that differ in the availability and reproduction quality of acoustic cues. The proposed models were evaluated in an online experiment gathering 108 participants, who were asked to judge the auditory distance of a stationary source. To evaluate the importance of environmental cues, participants were asked to describe the environment in which they were running the experiment, in particular the volume of the room and their distance to the wall they were facing. The results show that these context cues had a limited but significant influence on the perceived auditory distance.


Introduction

Audio-only augmented reality (AAR) consists of using spatial audio processing to enhance the real environment of a user with virtual sound events. A key requirement of AAR is that virtual sound sources be localized in the real environment as naturally as possible. Auditory distance perception is one component of the spatial localization of virtual sound sources and depends on a combination of several acoustic and nonacoustic distance cues [1]. Producing a virtual sound source in a real environment calls for the choice of a spatial audio processing method. This method directly influences the reproduction of the acoustic cues conveyed by the room effect, and may therefore affect the perceived distance of a virtual sound source. In AAR scenarios, the characteristics of the environment (geometry, room acoustics) are not necessarily controlled; a room divergence effect [2] may occur between the real listening environment and the acoustic cues conveyed by the virtual sound source processing.
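The intensity and reverberation distance cues discussed above can be illustrated with a small numeric sketch. This is not taken from the study; it only assumes an inverse-distance law for the direct sound (roughly -6 dB per doubling of distance) and a diffuse reverberant field whose level is approximately independent of source distance, so that the direct-to-reverberant ratio decreases as the source moves away:

```python
import math

def direct_level_db(distance_m, ref_distance_m=1.0):
    """Direct-sound level relative to the reference distance,
    following the inverse-distance law (~ -6 dB per doubling)."""
    return -20.0 * math.log10(distance_m / ref_distance_m)

def direct_to_reverberant_db(distance_m, reverb_level_db=-12.0,
                             ref_distance_m=1.0):
    """Direct-to-reverberant (D/R) ratio, assuming a diffuse
    reverberant field of constant level (a hypothetical -12 dB here)."""
    return direct_level_db(distance_m, ref_distance_m) - reverb_level_db

# The direct level falls with distance while the reverberant level
# stays constant, so the D/R ratio shrinks -- a classic distance cue.
for r in (1.0, 2.0, 4.0, 8.0):
    print(f"{r:4.1f} m: direct {direct_level_db(r):6.1f} dB, "
          f"D/R {direct_to_reverberant_db(r):6.1f} dB")
```

Under these assumptions, a source at 8 m has a direct level about 18 dB below that of a source at 1 m, while the D/R ratio drops by the same amount; a rendering method that reproduces only one of these cues (or neither) is expected to alter distance judgments.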
