Abstract

Virtual reality is becoming an important tool for studying the interaction between pedestrians and road vehicles, as it allows the analysis of potentially hazardous situations without exposing subjects to real risk. However, most current simulators are unable to accurately recreate traffic sounds that are congruent with the visual scene, a recognized shortcoming of the virtual audio-visual scenarios used in such contexts. This study proposes a method for delivering a binaural auralization of the noise generated by a moving vehicle to an arbitrarily located, moving listener (pedestrian). Building on previously developed methods, the proposal integrates a dynamic auralization engine in a novel way, enabling real-time updating of the acoustic cues in the binaural signal delivered via headphones. Furthermore, the proposed auralization routine uses a Close ProXimity (CPX) tyre-road noise signal as the sound source input, facilitating quick interchange of source signals and simplifying the noise collection procedure. Two validation experiments were carried out: one to quantitatively compare field signals with CPX-derived virtual signal recordings, and another to assess these same signals through psychoacoustic models. The latter aims to ensure that the reproduced synthesized signal is perceptually similar to what a pedestrian hears when interacting with a vehicle while crossing a street. Discrepancies were detected, and were most pronounced when the vehicle is close to the receiver (pedestrian); however, the analysis indicated that they pose no hindrance to the study of vehicle–pedestrian interaction. Improvements to the method are identified and further developments are proposed.
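
As a rough illustration of what a dynamic auralization engine of this kind computes, the hypothetical Python sketch below (not the authors' implementation) applies distance-dependent gain, a time-varying propagation delay, and a simple spherical-head ITD/ILD model to a mono signal standing in for a CPX recording; all parameter values and the head model are illustrative assumptions, and a real engine would typically use measured head-related transfer functions and head tracking instead.

```python
import numpy as np

FS = 44_100            # audio sample rate (Hz)
C = 343.0              # speed of sound (m/s)
HEAD_RADIUS = 0.0875   # approximate human head radius (m), assumed value

def render_passby(source, vehicle_xy, listener_xy, fs=FS):
    """Crude dynamic binaural render of a vehicle pass-by (illustrative only).

    source      : mono noise signal (stand-in for a CPX tyre-road recording)
    vehicle_xy  : (N, 2) vehicle position per output sample, in metres
    listener_xy : (N, 2) listener (pedestrian) position per output sample, in metres
    Returns an (N, 2) array: left and right headphone channels.
    """
    n = len(source)
    t = np.arange(n) / fs

    rel = vehicle_xy - listener_xy
    r = np.linalg.norm(rel, axis=1)                 # source-listener distance
    azimuth = np.arctan2(rel[:, 0], rel[:, 1])      # 0 rad = straight ahead (+y)

    # Spherical spreading, referenced to 1 m.
    gain = 1.0 / np.maximum(r, 1.0)

    # Simple spherical-head interaural time difference; ad hoc +/- 3 dB level tilt.
    itd = (HEAD_RADIUS / C) * np.sin(azimuth)
    ild = 10.0 ** (3.0 * np.sin(azimuth) / 20.0)

    # Each ear reads the source at its own emission time; the time-varying
    # propagation delay r/C automatically produces the Doppler shift.
    t_left = t - r / C - itd / 2.0    # source on the right -> left ear delayed more
    t_right = t - r / C + itd / 2.0

    left = np.interp(t_left, t, source, left=0.0, right=0.0) * gain / ild
    right = np.interp(t_right, t, source, left=0.0, right=0.0) * gain * ild
    return np.stack([left, right], axis=1)

if __name__ == "__main__":
    dur, speed = 6.0, 13.9                          # 6 s pass-by at ~50 km/h
    n = int(dur * FS)
    rng = np.random.default_rng(0)
    cpx_like = rng.standard_normal(n) * 0.05        # placeholder for a CPX signal
    veh = np.column_stack([np.linspace(-speed * dur / 2, speed * dur / 2, n),
                           np.full(n, 3.0)])        # passes 3 m in front of listener
    ped = np.zeros((n, 2))                          # stationary pedestrian
    stereo = render_passby(cpx_like, veh, ped)
```

Reading the source at each ear's emission time, rather than shifting the output, lets a single interpolation pass handle both the propagation delay and the Doppler shift, which is the kind of per-frame cue update the proposed engine performs in real time.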
