Abstract

The present study investigated whether manual tactile information from a speaker's face modulates speech intelligibility, comparing audio-tactile perception with audio-only perception. Since enhanced auditory and tactile skills have been reported in blind individuals, two groups of congenitally blind and sighted adults were compared. Participants performed a forced-choice syllable decision task across three conditions: audio-only and congruent/incongruent audio-tactile. In the auditory modality, syllables were presented either with or without acoustic noise; in the tactile modality, participants felt a synchronously mouthed syllable by placing a hand on the talker's face. In the absence of acoustic noise, syllables were almost perfectly recognized in all conditions. In contrast, when syllables were embedded in acoustic noise, more correct responses were reported for congruent mouthing than for no mouthing, and for no mouthing than for incongruent mouthing. Interestingly, no perceptual differences were observed between blind and sighted adults. These findings demonstrate that manual tactile information relevant to recovering speech gestures modulates auditory speech perception when acoustic information is degraded, and that audio-tactile interactions occur similarly in blind and sighted untrained listeners.
