Abstract

Echolocating bottlenose dolphins can discriminate among objects that vary in size, shape, and structure. Our purpose in this project is to investigate the echo features dolphins may use to determine each of these object properties (e.g., target strength, number, and position of highlights). A dolphin performed a cross-modal matching task in which he was presented with an object in one modality (e.g., vision) and then asked to choose the same object from among a group of three objects using another modality (e.g., echolocation). The dolphin was presented with object sets in which the objects within each set varied along a single feature (size, shape, material, or texture). These objects were later ensonified with dolphin-like clicks, and the object echoes were collected with a binaural (two-hydrophone) system. To mimic the dolphin's ability to scan across objects, echoes were captured as the objects were rotated. Differences in echoic features within object sets were examined alongside the dolphin's error patterns. In addition, an artificial neural network (ANN) was used to classify objects from the object echoes. The ANN was used to explore the importance of spectral versus time-domain features, and of single versus multiple object orientations, in object classification.
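For readers who want to experiment with this kind of analysis, the sketch below contrasts time-domain and spectral echo features as inputs to a small feed-forward network, in the spirit of the ANN comparison described above. It is a minimal illustration under stated assumptions, not the study's pipeline: the synthetic echoes, labels, feature definitions, and network size are all placeholders, and real inputs would be the binaural echo recordings collected across object orientations.

```python
# Minimal sketch (not the study's actual pipeline): compare time-domain vs.
# spectral echo features as inputs to a small feed-forward ANN classifier.
# Assumptions: each echo is a 1-D NumPy array; `echoes` and `labels` below are
# hypothetical placeholders for recorded echoes and their object identities.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def time_features(echo):
    """Coarse time-domain descriptors: envelope shape stands in for highlight structure."""
    env = np.abs(echo)
    peak = env.max()
    centroid = np.sum(np.arange(len(env)) * env) / np.sum(env)  # temporal centroid (samples)
    duration = np.count_nonzero(env > 0.1 * peak)               # -20 dB duration proxy
    return np.array([peak, centroid, duration])

def spectral_features(echo, n_bins=64):
    """Magnitude spectrum resampled to a fixed number of bins, normalized per echo."""
    mag = np.abs(np.fft.rfft(echo))
    idx = np.linspace(0, len(mag) - 1, n_bins).astype(int)
    return mag[idx] / (mag.max() + 1e-12)

def classify(feature_fn, echoes, labels):
    """Train a small ANN on the chosen feature set and report held-out accuracy."""
    X = np.stack([feature_fn(e) for e in echoes])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    net.fit(X_tr, y_tr)
    return net.score(X_te, y_te)

# Example with synthetic placeholder echoes; real echoes would come from the
# binaural recordings, with one feature vector per object orientation.
rng = np.random.default_rng(0)
echoes = [rng.standard_normal(512) for _ in range(120)]
labels = np.repeat([0, 1, 2], 40)  # three objects per set
print("time-domain accuracy:", classify(time_features, echoes, labels))
print("spectral accuracy:   ", classify(spectral_features, echoes, labels))
```

Pooling feature vectors from several rotation angles of the same object (e.g., concatenating or averaging them before classification) would be one simple way to probe the single- versus multiple-orientation question mentioned above.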
