Abstract

How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels of representation, such as categorical representation (e.g., the McGurk effect). It remains unclear whether cross-modal modulation can extend to low-level perceptual attributes. This study used motional information in manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity difference was around the just-noticeable difference (JND), with manual gestures presented concurrently with the second sound. In two behavioral experiments and two EEG experiments, we tested the hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased loudness judgments. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results across four behavioral and EEG experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level attribute such as loudness, at least under challenging listening conditions.

Highlights

  • Visual-motion information in manual gestures modulated early auditory neural responses that corresponded to changes in loudness perception at the just-noticeable difference (JND) threshold

  • Modulation effects of the gestures CLOSER and AWAY were observed in event-related potentials (ERPs) at early latencies (∼100 ms) rather than at late latencies, indicating modulation of perception rather than of decisional processes

Introduction

Imagine that you are boasting to a friend about the size of the fish you caught last weekend. You would probably raise the volume of your voice when you say the word “big” and, at the same time, move your hands away from each other. The iconic gestures in this example visually represent the size of the fish and parallel the volume of your voice. Suppose that two utterances have the same intensity; if a gesture accompanies one sound but not the other, would you perceive one sound as quieter or louder than the other? Whether and how the informational content in one modality penetrates processing in another modality is a fundamental question for understanding the nature of human perception.
