Abstract

The combination of auditory and visual information is expected to support efficient interface design for conveying spatial information. We therefore focus on audio-visual (AV) fusion, i.e., the perception of unity of auditory and visual information despite their spatial disparity [1]. A previous experiment showed that AV fusion varies over space, mainly with horizontal eccentricity [2]. Since auditory spatial information is coded relative to head position whereas visual information is coded relative to eye position, the question arises of whether eye position affects fusion. The present psychophysical experiment investigates the effect of a horizontal shift in eye position on the variation of AV fusion over the 2D frontal space. Results show that eye position affects AV fusion. The current data support the need to include eye-position inputs when displaying redundant visual and auditory information in integrated multimodal interfaces. Results are discussed with respect to the probable effect of visual display structure.
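The coordinate-frame argument above can be made concrete with a minimal sketch (not from the paper): because a sound's azimuth is coded relative to the head while a visual target's azimuth is coded relative to the eyes, a horizontal gaze shift changes the eye-centered position of a visual stimulus without changing the head-centered position of a co-located sound. The target azimuth and the `eye_in_head_deg` offset below are hypothetical values chosen only for illustration.

```python
# Illustrative sketch (hypothetical values): why eye position matters when
# auditory space is head-centered but visual space is eye-centered.

def visual_azimuth_eye_centered(target_head_deg: float, eye_in_head_deg: float) -> float:
    """Eye-centered azimuth of a visual target, given its head-centered
    azimuth and the horizontal eye-in-head position (all in degrees)."""
    return target_head_deg - eye_in_head_deg

def audio_azimuth_head_centered(target_head_deg: float) -> float:
    """An auditory target's azimuth stays head-centered regardless of gaze."""
    return target_head_deg

if __name__ == "__main__":
    target = 10.0  # target 10 deg right of the head midline (hypothetical)
    for gaze in (0.0, 15.0):  # gaze straight ahead vs. shifted 15 deg right (hypothetical)
        v = visual_azimuth_eye_centered(target, gaze)
        a = audio_azimuth_head_centered(target)
        print(f"gaze={gaze:+.0f} deg -> visual (eye-centered)={v:+.0f} deg, "
              f"audio (head-centered)={a:+.0f} deg")
```

With gaze straight ahead the two signals coincide; after a 15-degree gaze shift the visual azimuth changes while the auditory azimuth does not, which is why the abstract argues that eye-position inputs are needed to keep redundant audio and visual cues aligned.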
