Abstract

Biosonar mechanisms are highlighted by comparing acoustically reconstructed scenes derived from generalized methods with the performance of echolocating animals. Bioinspired computations that replace the standard methods then offer a path toward understanding the animals' solutions. Seemingly simple outdoor spaces present flying bats with complex scenes once vegetation and the ground are factored in, and even sound-treated research spaces such as flight rooms become complex scenes as transmitted sounds propagate through the space. Underwater scenes, particularly in shallow water, are far more complicated because longer propagation distances create multipath reverberation that often overlaps with echoes. Interpulse intervals of biosonar emissions must be short to support rapid updating of tracking and perception, but reverberation persisting across those intervals produces pulse-echo ambiguity. Two reconstructive methods are useful: the HARPEX method for visualizing the progression of echoes and reverberation following a sound, and a HARPEX-like method reconfigured as a forward-looking sonar. Acoustic datasets visualize airborne and underwater sonar scenes by reconstructing sound images for successive time frames after a transmitted sound, using either a tetrahedral soundfield microphone or a forward-looking sonar head. The complex acoustics of vegetation are examined with a simulation model built to understand the bats' perception of vegetation. (Work supported by ONR.)
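As a rough illustration of the reconstruction front end described above: HARPEX-style methods operate on B-format (first-order ambisonic) signals, which a tetrahedral soundfield microphone delivers after an A-to-B conversion. The sketch below converts four capsule signals to B-format and estimates a direction of arrival per time frame from the active acoustic intensity vector. The capsule ordering, the omission of calibration filters, and the use of a simple intensity-based direction estimate (rather than the full HARPEX planewave decomposition) are simplifying assumptions for illustration, not the method used in the paper.

```python
import numpy as np

def a_to_b(flu, frd, bld, bru):
    """Convert tetrahedral A-format capsule signals to B-format
    (W, X, Y, Z). Capsule ordering front-left-up, front-right-down,
    back-left-down, back-right-up is an assumption; real arrays also
    need frequency-dependent calibration filters."""
    w = flu + frd + bld + bru   # omnidirectional pressure
    x = flu + frd - bld - bru   # front-back figure-of-eight
    y = flu - frd + bld - bru   # left-right figure-of-eight
    z = flu - frd - bld + bru   # up-down figure-of-eight
    return w, x, y, z

def doa_per_frame(w, x, y, z, frame_len):
    """Estimate one direction-of-arrival unit vector per time frame
    from the time-averaged active intensity (pressure times the
    velocity-proxy components X, Y, Z)."""
    n_frames = len(w) // frame_len
    dirs = []
    for i in range(n_frames):
        s = slice(i * frame_len, (i + 1) * frame_len)
        intensity = np.array([np.mean(w[s] * x[s]),
                              np.mean(w[s] * y[s]),
                              np.mean(w[s] * z[s])])
        norm = np.linalg.norm(intensity)
        dirs.append(intensity / norm if norm > 0 else intensity)
    return np.array(dirs)
```

Applied frame by frame after a transmitted pulse, such per-frame direction estimates are one way to visualize how echoes and reverberation arrive from different parts of a scene over time.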
