Abstract

How the integration of information from different sensory systems occurs is one of the most difficult challenges in understanding human and robot perception and cognition. Auditory-visual integration is defined here as a correspondence problem. A motion-based schema for integrating auditory and visual information is proposed as a solution to the correspondence problem between perceived auditory and visual space. In this schema, motion extracted from auditory and visual information is combined to generate a complete perception of the world. Results from psychophysical experiments using auditory localization tasks support the author's hypothesis.
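The abstract does not specify how the auditory and visual motion signals are combined. A common illustrative model for such cue combination is reliability-weighted (inverse-variance) fusion; the sketch below is an assumption for illustration only, not the schema described in the paper. All names (`fuse_motion_estimates`, the variance parameters) are hypothetical.

```python
def fuse_motion_estimates(v_aud, var_aud, v_vis, var_vis):
    """Illustrative reliability-weighted fusion of two motion estimates.

    Each estimate is weighted by its inverse variance, so the more
    reliable (lower-variance) cue dominates the fused percept.
    This is a standard cue-combination model, assumed here for
    illustration; it is not taken from the paper itself.
    """
    w_aud = 1.0 / var_aud
    w_vis = 1.0 / var_vis
    v_fused = (w_aud * v_aud + w_vis * v_vis) / (w_aud + w_vis)
    var_fused = 1.0 / (w_aud + w_vis)  # fused estimate is more reliable than either cue alone
    return v_fused, var_fused


# Example: a noisy auditory motion estimate combined with a sharper visual one.
v, var = fuse_motion_estimates(v_aud=10.0, var_aud=4.0, v_vis=12.0, var_vis=1.0)
```

In this example the fused velocity lies closer to the visual estimate, reflecting vision's typically higher spatial reliability, which is consistent with vision often "capturing" auditory localization in such tasks.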
