Two notes sounded sequentially produce the melodic intervals and contours that form the basis of melody. Many previous studies have characterized pitch perception in cochlear implant (CI) users as poor, which may be due to the limited spectro-temporal resolution and/or spectral warping of electric hearing relative to acoustic hearing (AH). Poor pitch perception in CIs has been shown to distort melodic interval perception. To characterize this interval distortion, we recruited CI users with either normal (single-sided deafness, SSD) or limited (bimodal) AH in the non-implanted ear. The contralateral AH provided a stable reference against which to compare melodic interval perception in the CI ear within the same listener.

Melodic interval perception was compared across acoustic and electric hearing in 9 CI listeners (4 bimodal and 5 SSD). Participants ranked the size of a probe interval presented to the CI ear against a reference interval presented to the contralateral AH ear, using a method of constant stimuli. Ipsilateral interval ranking was also measured within the AH ear to ensure that listeners understood the task and that interval ranking was stable and accurate within AH. Stimuli were delivered to the AH ear via headphones and to the CI ear via direct audio input (DAI) to participants' clinical processors. During testing, a reference and a probe interval were presented, and participants indicated which was larger. Ten comparisons were presented for each reference-probe combination. Psychometric functions were fitted to the data to determine the probe interval size that matched the reference interval.

Across all AH reference intervals, the mean matched CI interval was 1.74 times larger than the AH reference. However, there was substantial inter-subject variability. For some participants, CI interval distortion varied across different reference AH intervals; for others, it was constant. Within the AH ear, ipsilateral interval ranking was accurate, confirming that participants understood the task. No significant differences in the patterns of results were observed between bimodal and SSD CI users.

The present data show that much larger intervals were needed with the CI to match contralateral AH reference intervals. As such, input melodic patterns are likely to be perceived as frequency-compressed and/or warped with electric hearing, with less variation among notes in the pattern. The high inter-subject variability in CI interval distortion suggests that CI signal processing should be optimized for individual CI users.
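To illustrate the kind of analysis described above, the sketch below shows one way a psychometric function could be fitted to ranking data to estimate the matched interval (the point of subjective equality). It is a minimal sketch, assuming a logistic function fitted with SciPy; the probe sizes, response proportions, and the 4-semitone reference are hypothetical examples, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: proportion of trials on which the CI probe interval was
# judged larger than a 4-semitone AH reference, for each probe size tested
# (10 comparisons per reference-probe combination, as in the study design).
probe_semitones = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
p_probe_larger = np.array([0.0, 0.1, 0.3, 0.6, 0.9, 1.0])

def logistic(x, pse, slope):
    """Logistic psychometric function; pse is the point of subjective equality."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

(pse, slope), _ = curve_fit(logistic, probe_semitones, p_probe_larger, p0=[6.0, 1.0])

reference_semitones = 4.0  # hypothetical AH reference interval
print(f"Matched CI interval: {pse:.2f} semitones "
      f"({pse / reference_semitones:.2f} times the AH reference)")
```

The ratio printed at the end corresponds to the kind of interval-distortion factor reported above (e.g., a mean of 1.74 across listeners), though the study's exact fitting procedure may differ from this sketch.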