Abstract

Understanding perceptual learning effects under novel acoustic circumstances, e.g., situations of hearing loss or cochlear implantation, constitutes a critical goal for research in the hearing sciences and for basic perceptual research surrounding spoken language use. These effects have primarily been studied in traditional laboratory settings using stationary subjects, pre-recorded materials, and a restricted set of potential subject responses. In the present series of experiments, we extended this paradigm to investigate perceptual learning in a situated, interactive, real-world context for spoken language use. Experiments 1 and 2 compared the learning achieved by normal-hearing subjects experiencing real-time cochlear implant acoustic simulation in either conversation or traditional feedback-based computer training. In Experiment 1, we found that interactive conversational subjects achieved perceptual learning equal to that of laboratory-trained subjects for speech recognition in quiet, but neither group generalized this learning to other domains. Experiment 2 replicated the learning findings for speech recognition in quiet and further demonstrated that subjects given active perceptual exposure were able to transfer their perceptual learning to a novel task, gaining significantly more benefit from the availability of semantic context in an isolated word recognition task than subjects who completed conventional laboratory-based training.
