Abstract
Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio–visual and audio–haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio–visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio–haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.