Abstract

Auditory perception is shaped by the spectral properties of surrounding sounds. For example, when spectral properties differ between earlier (context) and later (target) sounds, spectral contrast effects (SCEs; i.e., shifts in categorization boundaries) can bias perception of the target sounds. SCEs influence perception of speech and nonspeech sounds alike. When listeners categorized vowels or consonants, SCE magnitudes increased linearly with greater spectral differences between context and target speech sounds [Stilp et al. (2015) JASA; Stilp & Alexander (2016) POMA; Stilp & Assgari (2017) JASA]. Here, we tested whether this linear relationship between context spectra and SCEs generalizes to nonspeech categorization. Listeners categorized musical instrument targets that varied from French horn to tenor saxophone. Before each target, listeners heard a one-second music sample processed by spectral envelope difference filters that amplified or attenuated frequencies to reflect the difference between horn and saxophone spectra. By varying filter gain, filters reflected part (25%, 50%, 75%) or all (100%) of the difference between instrument spectra. As filter gains increased to reflect more of this difference, SCE magnitudes increased linearly, paralleling results for speech categorization. Thus, a close relationship between context spectra and biases in target categorization appears to be fundamental to auditory perception.
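As a rough illustration of the filtering manipulation described above, the sketch below builds a spectral envelope difference filter in Python (NumPy/SciPy) and scales its gain to 25%, 50%, 75%, or 100% of the horn-minus-saxophone envelope difference. All parameter values (sampling rate, FFT size, filter length, smoothing width) and the envelope estimator are illustrative assumptions; the published study's exact filter construction may differ.

```python
import numpy as np
from scipy.signal import firwin2, lfilter

FS = 44100     # sampling rate in Hz (assumption; not stated in the abstract)
N_TAPS = 513   # FIR filter length; odd so the Nyquist gain may be nonzero

def spectral_envelope_db(signal, n_fft=4096, smooth_bins=32):
    """Smoothed long-term magnitude spectrum in dB (coarse spectral envelope)."""
    mag = np.abs(np.fft.rfft(signal, n_fft))
    # Moving-average smoothing is a stand-in for whatever envelope
    # estimator the study actually used (e.g., cepstral or ERB-spaced).
    kernel = np.ones(smooth_bins) / smooth_bins
    smoothed = np.convolve(mag, kernel, mode="same")
    return 20 * np.log10(smoothed + 1e-12)

def difference_filter(horn, sax, gain_fraction, n_fft=4096):
    """FIR filter whose gain reflects gain_fraction (0..1) of the
    horn-minus-saxophone spectral envelope difference."""
    diff_db = spectral_envelope_db(horn, n_fft) - spectral_envelope_db(sax, n_fft)
    gains = 10 ** (gain_fraction * diff_db / 20)   # dB -> linear amplitude
    freqs = np.linspace(0.0, 1.0, len(gains))      # normalized 0..Nyquist
    return firwin2(N_TAPS, freqs, gains)

# Example: filter a 1-s context sample at each of the four gain levels.
# Noise placeholders stand in for real instrument and music recordings.
rng = np.random.default_rng(0)
horn, sax, context = (rng.standard_normal(FS) for _ in range(3))
for frac in (0.25, 0.50, 0.75, 1.00):
    filtered = lfilter(difference_filter(horn, sax, frac), 1.0, context)
```

Scaling the difference in dB rather than in linear amplitude keeps the 25%-100% conditions evenly spaced on a level scale, which seems consistent with the "varying filter gain" manipulation described in the abstract, though the original gain scaling is an assumption here.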
