Abstract

The perceptual organization of auditory scenes requires parsing the incoming flow of acoustic inputs into perceptual streams. It is likely that cues from several modalities are pooled for auditory scene analysis, including sensory-motor cues related to the active exploration of the scene. We report effects of source motion and head motion on auditory streaming. In a streaming paradigm, listeners hear a sequence of repeating tone triplets and indicate whether they perceive one or two subjective sources, called streams. We used a robotic telepresence system, Telehead, to disentangle three components of head motion: changes in acoustic cues at the ears, changes in subjective location cues, and motor cues. We found that head motion induced perceptual reorganization even when the auditory scene had not changed. We further analyzed the data to probe the time course of sensory-motor integration. Motor cues impacted perceptual organization earlier and for a shorter time than acoustic or subjective location cues, with successive positive and negative contributions to streaming. An additional experiment showed that arm or leg movements did not affect scene analysis. Our results suggest a loose temporal coupling between the different mechanisms involved.
