Abstract

Over the last 10 years, high-frequency bursts of action potentials have been the subject of intense research aimed at understanding their potential role in information encoding. Based on recordings from auditory thalamus neurons (n = 302) collected during anesthesia (pentobarbital, urethan, or ketamine/xylazine), waking (W), and slow-wave sleep (SWS), we investigated how bursts contribute to frequency tuning, intensity functions, response latency (and latency variability), and stimulus detectability. Although present in all experimental conditions, bursts never dominated the cells' mode of discharge: the highest proportion was found during ketamine/xylazine anesthesia (22%), the lowest during waking (4.5%). In all experimental conditions, bursts preferentially occurred at or around the cells' best frequency (BF), thus increasing frequency selectivity. This effect was observed at both the intensities producing the highest and the lowest evoked responses. Analysis of the intensity functions indicated that, for most cells, there was no systematic relationship between burst proportion and response strength. Under several conditions (W, SWS, and urethan), when cells exhibited >20% bursts, the variability of their response latency was reduced in burst mode compared with single-spike mode. During W, this effect was accompanied by a reduction of the response latency itself. Finally, a receiver operating characteristic analysis indicated no particular relationship between bursts and stimulus detectability. Compared with the single-spike mode, which occurs over broader frequency ranges, the prominence of bursts at the BF should help filter the information reaching the targets of medial geniculate cells at both cortical and subcortical levels.
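As an illustration of the kind of receiver operating characteristic (ROC) analysis mentioned above, the sketch below computes a detectability index (area under the ROC curve) from per-trial spike counts, separately for single-spike and burst responses. This is a minimal, hypothetical example: the function name, the simulated Poisson counts, and the windowing into "spontaneous" versus "evoked" counts are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def detection_auc(evoked_counts, spontaneous_counts):
    """Area under the ROC curve for discriminating stimulus-evoked from
    spontaneous spike counts: the probability that a randomly chosen
    evoked trial exceeds a randomly chosen spontaneous trial
    (ties counted as 0.5). AUC = 0.5 means no detectability."""
    evoked = np.asarray(evoked_counts, dtype=float)
    spont = np.asarray(spontaneous_counts, dtype=float)
    # Compare every evoked trial with every spontaneous trial.
    greater = (evoked[:, None] > spont[None, :]).mean()
    ties = (evoked[:, None] == spont[None, :]).mean()
    return greater + 0.5 * ties

# Hypothetical per-trial spike counts, split by discharge mode.
rng = np.random.default_rng(0)
spont = rng.poisson(1.0, size=50)            # counts in a pre-stimulus window
evoked_single = rng.poisson(3.0, size=50)    # single-spike counts after tone onset
evoked_burst = rng.poisson(3.0, size=50)     # burst-spike counts after tone onset

print("AUC, single-spike mode:", detection_auc(evoked_single, spont))
print("AUC, burst mode:", detection_auc(evoked_burst, spont))
```

Comparing the two AUC values across cells is one simple way to ask whether burst responses signal stimulus presence better than single-spike responses; under the assumptions above, similar AUCs would correspond to the paper's finding of no particular relation between bursting and detectability.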
