Abstract
Object constancy, the ability to recognize objects despite changes in orientation, has not been well studied in the auditory modality. Dolphins use echolocation for object recognition, and objects ensonified by dolphins produce echoes that can vary significantly as a function of orientation. In this experiment, human listeners classified echoes from objects varying in material, shape, and size that were ensonified with dolphin signals. Participants were trained to discriminate among the objects using an 18-echo stimulus drawn from a 10° range of aspect angles, then tested with novel aspect angles across a 60° range. Participants were typically successful at recognizing the objects at all angles (M = 78%). Artificial neural networks were trained and tested with the same stimuli to identify the acoustic cues that enable object recognition. A multilayer perceptron performed similarly to the humans and revealed that recognition was enabled both by the amplitude and frequency of echoes and by the temporal dynamics of these features over the course of echo trains. These results provide insight into the representational processes underlying echoic recognition in dolphins and suggest that object constancy perceived through the auditory modality is likely to parallel what has been found in the visual domain in studies of both humans and animals.
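To make the modeling approach concrete, here is a minimal sketch of a multilayer-perceptron classifier in the spirit of the one described above. Everything specific in it is an assumption for illustration: the synthetic echo-train data, the two-feature (amplitude, frequency) summary of each echo, the layer sizes, and the train/test split are not taken from the study, whose feature extraction and architecture are not specified in this abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical stand-in data: each stimulus is an 18-echo train, and each
# echo is summarized by two features (peak amplitude, peak frequency).
n_objects, n_trains, n_echoes, n_features = 4, 50, 18, 2

X, y = [], []
for obj in range(n_objects):
    # Give each object its own amplitude/frequency profile that varies
    # across the echo train, loosely mimicking aspect-angle dynamics.
    profile = rng.normal(loc=obj, scale=0.1, size=(n_echoes, n_features))
    for _ in range(n_trains):
        train = profile + rng.normal(scale=0.5, size=(n_echoes, n_features))
        X.append(train.ravel())  # flatten the train into one input vector
        y.append(obj)

X, y = np.asarray(X), np.asarray(y)

# A small multilayer perceptron, analogous in spirit to the classifier
# described in the abstract (the hidden-layer size here is arbitrary).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X[::2], y[::2])  # train on half the echo trains
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```

Flattening each 18-echo train into a single input vector is one simple way to let the network exploit the temporal dynamics the abstract highlights; the study's actual input representation may differ.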