Abstract

Most naturally occurring sounds are modulated in amplitude or frequency; important examples include animal vocalizations and species-specific communication signals in mammals, insects, reptiles, birds and amphibians. Deciphering the information in amplitude-modulated (AM) sounds is a well-understood process, requiring phase locking of primary auditory afferents to the modulation envelope. The mechanism for decoding frequency modulation (FM) is less clear because the FM envelope is flat (Fig. 1). One biological solution is to monitor amplitude fluctuations in frequency-tuned cochlear filters as the instantaneous frequency of the FM sweeps through the passband of these filters. This view postulates an FM-to-AM transduction whereby a change in frequency is transmitted as a change in amplitude. This is an appealing idea because, if such transduction occurs early in the auditory pathway, it provides a neurally economical solution to how the auditory system encodes these important sounds. Here we show that FM and AM sounds must be transformed into a common neural code in the brain stem. Observers can accurately determine whether the phase of an FM presented to one ear is leading or lagging, by only a fraction of a millisecond, the phase of an AM presented to the other ear. A single intracranial image is perceived, the spatial position of which is a function of this phase difference.
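The FM-to-AM transduction hypothesis described in the abstract can be sketched numerically. In this minimal simulation (all parameter values are illustrative, not taken from the paper, and a simple two-pole resonator stands in for a frequency-tuned cochlear filter), a constant-amplitude FM tone enters the filter with a flat envelope and leaves with a fluctuating one:

```python
import numpy as np

fs = 16000                      # sample rate (Hz)
t = np.arange(0, 0.5, 1.0 / fs)

# FM tone: flat amplitude, instantaneous frequency fc + df*sin(2*pi*fm*t)
fc, fm, df = 1000.0, 8.0, 100.0
x = np.sin(2 * np.pi * fc * t - (df / fm) * np.cos(2 * np.pi * fm * t))

# Narrow two-pole resonator centered at the top of the FM sweep,
# a crude stand-in for one frequency-tuned cochlear filter
f0, r = 1100.0, 0.995           # center frequency (Hz), pole radius
w0 = 2 * np.pi * f0 / fs
a1, a2, b0 = -2 * r * np.cos(w0), r * r, 1 - r
y = np.zeros_like(x)
for n in range(2, len(x)):
    y[n] = b0 * x[n] - a1 * y[n - 1] - a2 * y[n - 2]

def envelope(sig, win):
    """Short-time RMS envelope with a rectangular window."""
    k = np.ones(win) / win
    return np.sqrt(np.convolve(sig ** 2, k, mode="valid"))

skip = int(0.1 * fs)            # discard the filter's startup transient
ex = envelope(x[skip:], fs // 100)   # 10 ms windows
ey = envelope(y[skip:], fs // 100)

flat_in = ex.max() / ex.min()   # close to 1: the FM input envelope is flat
mod_out = ey.max() / ey.min()   # well above 1: the filter output is amplitude-modulated
```

That `mod_out` greatly exceeds `flat_in` is the transduction in miniature: the frequency excursion through the filter's passband has been re-expressed as an amplitude fluctuation at the modulation rate, which downstream neurons could then encode the same way they encode AM.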
