Abstract

Information representation in augmented and virtual reality systems, and in social physical (building) spaces, can enhance the efficacy of interacting with and assimilating abstract, non-visual data. Sonification is the automatic, real-time representation of information as sound. There is a gap in our implementation and knowledge of auditory display systems used to enhance interaction in virtual and augmented reality. This paper addresses that gap by examining methodologies for mapping socio-spatial data to spatialised sonification manipulated with gestural controllers. This is a system of interactive knowledge representation that completes the human integration loop, enabling the user to interact with and manipulate data using 3D spatial gesture and 3D auditory display. Benefits include: 1) added immersion in an augmented or virtual reality interface; 2) auditory display avoids visual overload in visually saturated tasks such as design, emergency evacuation, piloting aircraft, and computer gaming; and 3) bi-modal or auditory representation, due to its time-based character, facilitates cognition of complex information.
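The mapping the abstract describes can be illustrated with a minimal, hypothetical parameter-mapping sketch (the function name, ranges, and the choice of pitch and stereo pan as target parameters are illustrative assumptions, not the paper's actual mapping):

```python
# Hypothetical parameter-mapping sonification: each normalised
# socio-spatial data point (x position in [0, 1], value in [0, 1])
# is mapped to a pitch and a stereo pan position -- the simplest
# form of a spatialised auditory display.

def map_to_sound(x, value, f_min=220.0, f_max=880.0):
    """Map a normalised data point to (frequency_hz, pan).

    value -> pitch, interpolated logarithmically between f_min and
             f_max so equal data steps sound perceptually even;
    x     -> pan, from -1.0 (hard left) to +1.0 (hard right).
    """
    freq = f_min * (f_max / f_min) ** value
    pan = 2.0 * x - 1.0
    return freq, pan

# A gestural controller could then rescale this mapping in real
# time, e.g. a pinch gesture narrowing the f_min..f_max range.
```

The returned parameters would drive a synthesiser or spatial audio engine; that rendering stage is omitted here.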
