Abstract

Knowledge of hearing abilities, as represented in audiograms, is vital for understanding animal acoustic physiology, behaviour, and ecology. Such knowledge also plays an important role in measuring, predicting, or counteracting the effects of anthropogenic noise on the environment. Currently, audiogram data are usually available only embedded in individual scientific publications and in various unstandardized formats, which makes accessing and analysing audiograms across sources cumbersome. We established a new database that presents audiograms along with data on the underlying audio-physiological experiments and the original publications in a structured and easily accessible way. The interface enables audiogram data to be combined for comparative analysis across species, experimental conditions, or publications. Currently focused on marine vertebrates, its content is the result of an extensive survey of the scientific literature and manual curation of the contained audio-physiological data. The database is designed to accommodate audiogram data on any biological group and is intended to be extended and to serve as a reference source for audiogram data. It is publicly accessible at https://animalaudiograms.museumfuernaturkunde.berlin. [The database was developed as part of the project “Hearing in Penguins” funded by the German Environment Agency (UBA) with funds from the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU, FKZ3717182440).]
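
To illustrate the kind of comparative analysis the abstract describes, the following is a minimal sketch of combining audiogram data exported from such a database. It assumes a hypothetical CSV export with columns `species`, `frequency_hz`, and `threshold_db`; the file name, column names, and the sensitivity-band heuristic are illustrative assumptions, not the database's actual export format or analysis tools.

```python
# Sketch: comparing audiograms across species from a hypothetical CSV export.
# Assumed columns: species, frequency_hz, threshold_db (names are illustrative).
import csv
from collections import defaultdict


def load_audiograms(path):
    """Group (frequency, threshold) pairs by species from a CSV export."""
    audiograms = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            audiograms[row["species"]].append(
                (float(row["frequency_hz"]), float(row["threshold_db"]))
            )
    # Sort each audiogram by frequency so the curves can be compared directly.
    return {species: sorted(points) for species, points in audiograms.items()}


def best_hearing_band(points, window_db=10.0):
    """Frequencies whose thresholds lie within window_db of the lowest
    (most sensitive) threshold in this audiogram."""
    best = min(threshold for _, threshold in points)
    return [freq for freq, threshold in points if threshold <= best + window_db]


if __name__ == "__main__":
    data = load_audiograms("audiograms_export.csv")  # hypothetical export file
    for species, points in data.items():
        print(species, "most sensitive band (Hz):", best_hearing_band(points))
```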
