Abstract

This paper studies a new sensorial approach to improving communication between partners during collaborative tasks taking place in abstract, non-visual virtual reality environments. The approach was investigated in the context of the search and identification of targets in a simplified 2D environment. The task consists of finding a spatial configuration that matches a defined criterion, such as a maximum, minimum, or target score. During the collaborative search, users need to be aware of their own results as well as those of their partners. In addition, they need to compare the scores they examine (e.g., a docking score or physical value) with other results in order to make decisions. To support these needs, an audio-haptic display was developed, employing binaural audio with an intermodal stimulus synthesis design to improve the collaborative search. This rendering tool allows simultaneous use of the audio and haptic channels, which enables efficient individual search and comparison of results. In addition, it improves communication and activity coordination between the partners. An experiment was carried out to evaluate the tool's contribution to improving the collaborative search for targets in a 2D non-visual environment. The results show a significant improvement in performance and working efficiency with the audio-haptic display compared to a traditional haptic-only condition. Moreover, we observed a reduced need for verbal communication during some steps of the search process. However, the approach introduces some communication conflicts during steps involving high-level interaction between partners, which reduced the working efficiency of some groups.
