Abstract
One of the most challenging problems for theories of speech perception is the lack of invariance between the acoustic signal and perception. Due to physical constraints upon articulators, there is a good deal of context-dependency both in speech production and in the resulting acoustic speech signal. Consequently, the acoustic pattern most closely related to a given speech sound varies dramatically depending on context. Yet, by some means, the perceptual system perceives these unique acoustic events as linguistically equivalent. This phenomenon is observed experimentally as perceptual context effects, whereby adjacent speech can modulate the perceived identity of a given speech sound. Recent perceptual results suggest that this phenomenon may be governed by general auditory mechanisms [Holt et al., 1996; Lotto et al., 1997; Lotto and Kluender, 1998; Holt, 1999] rather than speech-specific processes. In the present study, we sought to explore how such context-dependencies might be encoded. We recorded responses of ventral cochlear nucleus (VCN) neurons of anesthetized chinchillas to nonspeech target stimuli preceded by adjacent context stimuli that varied in spectral content. Results demonstrate context-dependent effects of frequency and intensity.