Abstract

To successfully negotiate a complex environment, an animal must control the timing of motor behaviors in coordination with dynamic sensory information. Here, we report on adaptive temporal control of vocal–motor behavior in an echolocating bat, Eptesicus fuscus, as it captured tethered insects close to background vegetation. Recordings of the bat's sonar vocalizations were synchronized with high-speed video images that were used to reconstruct the bat's three-dimensional flight path and the positions of target and vegetation. When the bat encountered the difficult task of taking insects as close as 10–20 cm from the vegetation, its behavior changed significantly from that under open room conditions. Its success rate decreased by about 50%, its time to initiate interception increased by a factor of ten, and its high repetition rate “terminal buzz” decreased in duration by a factor of three. Under all conditions, the bat produced prominent sonar “strobe groups,” clusters of echolocation pulses with stable intervals. In the final stages of insect capture, the bat produced strobe groups at a higher incidence when the insect was positioned near clutter. Strobe groups occurred at all phases of the wingbeat (and inferred respiration) cycle, challenging the hypothesis of strict synchronization between respiration and sound production in echolocating bats. The results of this study provide a clear demonstration of temporal vocal–motor control that directly impacts the signals used for perception.

[Figure caption fragment from page extraction: (A) mean percentage of time the bats produced sonar strobe groups during the 1,000-ms period before target contact; data points plot the mean percentage of time strobing at midpoints of 200-ms intervals; time axes differ across plots.]


Introduction

Echolocating bats rely on active sensing through acoustic channels and can orient in complete darkness. They produce ultrasonic vocal signals and use information contained in the returning echoes to determine the direction and distance of objects in space (reviewed in [1]). With their biological sonar, bats can successfully forage and avoid obstacles by rapidly processing spatial information carried by echoes of their sonar broadcasts. The bat's perceptual processes and motor control must operate in concert to enable auditory scene analysis and spatial orientation by sonar in a complex environment [2]. Like active vision, which involves the coordination of eye, head, and body movements with the processing and interpretation of retinal images (see [4,5,6]), echolocation gives rise to spatial perception from neural computations within and across both sensory and motor systems.

