Abstract
Perceptual organization of auditory scenes requires parsing the incoming flow of acoustic inputs into perceptual streams. Cues from several modalities are likely pooled for auditory scene analysis, including sensory-motor cues related to active exploration of the scene. We show effects of source motion and head motion on auditory streaming. In a streaming paradigm, listeners hear a sequence of repeating tone triplets and indicate whether they perceive one or two subjective sources, called streams. We used a robotic telepresence system, Telehead, to disentangle three components of head motion: changes in acoustic cues at the ear, subjective location cues, and motor cues. We found that head motion induced perceptual reorganization even when the auditory scene had not changed. We further analyzed the data to probe the time course of sensory-motor integration. Motor cues affected perceptual organization earlier and for a shorter time than acoustic or subjective location cues, with successive positive and negative contributions to streaming. An additional experiment showed that arm or leg movements had no effect on scene analysis. Our results suggest a loose temporal coupling between the different mechanisms involved.