Abstract

This experiment examined spoken word recognition during and after perceptual learning of an unusual acoustic transformation of speech. Previous research has found that adults and children receive more benefit from cochlear implants when auditory information is effectively integrated with visual lipreading information. Because cochlear implant users have little, if any, preoperative experience with auditory information, these findings suggest that perceivers are able to capitalize on already-acquired lipreading skills in order to learn to perceive new acoustic information about speech effectively. This may be because auditory and visual speech are lawfully related to the same physical event, a spoken utterance. Alternatively, it is possible that the addition of any relevant stimulus, regardless of its lawful relation to auditory speech, will aid in the process of learning to deal with new perceptual information. To test these hypotheses, normal-hearing participants were trained under three different conditions of perceptual input while learning to perceive frequency-inverted speech: auditory-alone, auditory with orthographic visual stimulation, and auditory with visual information about the lips and face. The results showed distinct patterns of perceptual learning and are discussed in terms of their implications for current theories of perceptual integration.
