Objectives: Auditory cortical N100s were examined in ten auditory neuropathy (AN) subjects as objective measures of impaired hearing.

Methods: Latencies and amplitudes of N100 in AN subjects to increases of frequency (4–50%) or intensity (4–8 dB) of low- (250 Hz) or high- (4000 Hz) frequency tones were compared with results from normal-hearing controls. The sites of auditory nerve dysfunction were pre-synaptic (n = 3), due to otoferlin mutations causing temperature-sensitive deafness; post-synaptic (n = 4), accompanied by other cranial and/or peripheral neuropathies; and undefined (n = 3).

Results: AN subjects consistently had N100s only to the largest changes of frequency or intensity, whereas controls consistently had N100s to all but the smallest frequency and intensity changes. N100 latency in AN subjects was significantly delayed compared to controls, more so for 250 Hz than for 4000 Hz and more so for changes of intensity than of frequency. N100 amplitudes to frequency change were significantly reduced in AN subjects compared to controls, except in pre-synaptic AN subjects, whose amplitudes were greater than those of controls. N100 latency to frequency change at 250 Hz, but not at 4000 Hz, was significantly related to speech perception scores.

Conclusions: As a group, AN subjects' N100 potentials were abnormally delayed and smaller, particularly for the low frequency. The extent of these abnormalities differed between pre- and post-synaptic forms of the disorder.

Significance: Abnormalities of auditory cortical N100 in AN reflect disorders of both temporal processing (low frequency) and neural adaptation (high frequency). Auditory N100 latency to the low frequency provides an objective measure of the degree of impaired speech perception in AN.