Abstract
Soundscape design is a growing concern in architectural practice, yet conveying soundscape properties is a challenge. Quantitative measures like decibel levels offer little utility on their own: birdsong and squealing brakes may occur at the same frequency and level, yet one is obviously preferable. Furthermore, the visual qualities of a scene have a substantial effect on perception of the acoustic environment. To improve communication between acoustic designers, architects, and clients, an augmented reality interface has been designed to allow comparisons of soundscapes and their relationships to the built environment. The interface consists of a physical scale model, printed data, and a tablet application. The augmented reality application uses a machine vision algorithm to recognize the model and allow on-screen interaction. On the tablet, measurement points are displayed on a live image of the model at their respective locations, and the user may select a point by touch, which begins sound playback and displays an immersive image of the scene. The user can then look around a spherical image of the scene using the tablet as a movable window, and thereby listen to the environment with appropriate accompanying visual cues. The interface has been a useful tool for communicating urban sound issues.
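The "tablet as a movable window" idea can be made concrete with a small geometric sketch: device orientation (yaw and pitch) is mapped to a position in an equirectangular panorama, which determines the region of the spherical image shown on screen. The following is a minimal illustration, not the paper's implementation; the function name, angle conventions, and image dimensions are assumptions for the example.

```python
import math

def equirect_viewport_center(yaw_deg, pitch_deg, pano_w, pano_h):
    """Map a device orientation to the pixel at the center of the
    corresponding viewport in an equirectangular panorama.

    Convention (assumed): yaw 0 faces the horizontal center of the
    panorama, pitch 0 is the horizon, positive pitch looks up.
    """
    # Longitude wraps around [-180, 180) degrees across the image width.
    u = ((yaw_deg + 180.0) % 360.0) / 360.0 * pano_w
    # Latitude spans [-90, 90] degrees; clamp pitch to stay in the image.
    pitch = max(-90.0, min(90.0, pitch_deg))
    v = (90.0 - pitch) / 180.0 * pano_h
    return u, v

# Facing straight ahead at the horizon lands on the panorama center.
print(equirect_viewport_center(0.0, 0.0, 3600, 1800))    # (1800.0, 900.0)
# Turning 90 degrees right and looking 45 degrees up moves the window.
print(equirect_viewport_center(90.0, 45.0, 3600, 1800))  # (2700.0, 450.0)
```

In the described interface, an update like this would run on every orientation-sensor event, keeping the on-screen crop of the spherical image aligned with the direction the tablet is pointed.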