Abstract

Difficulties in consonant perception have been associated with broader speech perception difficulties in hearing-impaired (HI) listeners, but the neural bases of these difficulties remain poorly understood. The goal of this study was to use scalp electroencephalography (EEG) recordings to better understand the neural mechanisms that contribute to consonant perception difficulties in HI listeners. Such a psychophysiological approach could inform the development of improved hearing-aid fitting and speech enhancement strategies. Perceptual and EEG responses to vowel-consonant-vowel speech were measured in 8 HI listeners with and without amplification. A machine-learning classifier was trained to discriminate the EEG signals evoked by each consonant, and its performance was compared to the HI listeners' psychophysical performance. For all subjects, consonant intelligibility was better in the aided than in the unaided listening condition, but overall performance remained well below ceiling. An information transmission analysis showed that place and manner of articulation were more affected than voicing and nasality. EEG waveforms showed distinct response patterns for each consonant, and the machine-learning classifier successfully "decoded" consonants from the EEG signal. However, a straightforward relationship between the neural and perceptual representations of consonants could not be established in HI listeners.
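The abstract names two analyses without giving implementation detail: a classifier that decodes consonants from EEG epochs, and a feature information transmission analysis in the style of Miller and Nicely (1955). The Python sketch below illustrates both under stated assumptions; the shrinkage-LDA decoder, the epoch dimensions, the 16-consonant inventory, the synthetic data, and the example voicing confusion matrix are all illustrative stand-ins, not the study's actual pipeline.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 960, 32, 64    # hypothetical epoch dimensions
epochs = rng.standard_normal((n_trials, n_channels, n_samples))  # stand-in for real epoched EEG
labels = rng.integers(0, 16, size=n_trials)      # hypothetical 16-consonant inventory

# Flatten each epoch into one feature vector and cross-validate a linear decoder.
X = epochs.reshape(n_trials, -1)
decoder = make_pipeline(StandardScaler(),
                        LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"))
scores = cross_val_score(decoder, X, labels, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = {1/16:.3f})")

# Feature information transmission from a confusion matrix:
# relative transmitted information = mutual information / stimulus entropy.
def relative_info_transmitted(conf):
    p = conf / conf.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)                   # stimulus/response marginals
    nz = p > 0
    mi = np.sum(p[nz] * np.log2(p[nz] / np.outer(px, py)[nz]))  # bits transmitted
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))              # stimulus entropy in bits
    return mi / hx

# Example: a hypothetical 2x2 confusion matrix for the voicing feature
# (rows = presented voiced/voiceless, columns = responded voiced/voiceless).
voicing_conf = np.array([[90., 10.],
                         [25., 75.]])
print(f"Voicing information transmitted: {relative_info_transmitted(voicing_conf):.2f}")

With synthetic random epochs the decoder performs at chance; the point of the sketch is the structure (epochs-to-features, cross-validated linear decoding, per-feature transmission scores), into which real preprocessed EEG and behavioral confusion matrices would be substituted.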
