Abstract

Adults' ability to attain and retain nonnative speech sound categories varies substantially among individuals. While we know that speech-perceptual skills play a role, we know less about how consolidation-related changes in acoustic-phonetic memory contribute to perceptual tasks. The goal of this investigation was to examine the contributions of memory and perceptual skills to perceptual performance on a trained nonnative speech contrast over two days. Twenty-one adult participants (ages 18-24) completed four different experiments. Three of these assessed learning and memory: visual statistical learning (implicit), visual object recognition (explicit), and nonnative (Hindi dental-retroflex) speech-sound training. Participants completed the learning tasks around 8 p.m., and performance was measured shortly after learning and again 12 hours later. On a separate day, participants completed a categorical perception task on a native (/a/-/e/) vowel continuum. Nonnative speech perception was associated with implicit learning performance when both were assessed shortly after learning, and with the retention of explicit memory when both were assessed after an overnight delay. Perception of native speech sounds was at least marginally associated with nonnative speech perception performance on both days, with a stronger association observed for performance assessed on Day 2. These findings provide preliminary support for the interpretation that speech sounds are encoded by at least two memory systems in parallel, but that perceptual performance may reflect acoustic-phonetic knowledge learned by different memory systems depending on the time since exposure. Moreover, performance on speech perception tasks for both native and nonnative speech sounds may rely on similar retrieval mechanisms for long-term storage of speech-sound information.
