Abstract

Animal acoustic signals are widely used in diverse research areas due to the relative ease with which sounds can be registered across a wide range of taxonomic groups and research settings. However, bioacoustics research can quickly generate large data sets, which might prove challenging to analyse promptly. Although many tools are available for the automated detection of sounds, choosing the right approach can be difficult and only a few tools provide a framework for evaluating detection performance. Here, we present ohun, an R package intended to facilitate automated sound event detection. ohun provides functions to diagnose and optimize detection routines, compare performance among different detection approaches and evaluate the accuracy in inferring the temporal location of events. The package uses reference annotations containing the time position of target sounds in a training data set to evaluate detection routine performance using common signal detection theory indices. This can be done both with routine outputs imported from other software and with detections run within the package. The package also provides functions to organize acoustic data sets in a format amenable to detection analyses. In addition, ohun includes energy‐based and template‐based detection methods, two commonly used automatic approaches in bioacoustics research. We show how ohun can be used to automatically detect vocal signals with case studies of adult male zebra finch Taeniopygia guttata songs and Spix's disc‐winged bat Thyroptera tricolor ultrasonic social calls. We also include examples of how to evaluate the detection performance of ohun and external software. Finally, we provide some general suggestions to improve detection performance.
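The sketch below illustrates the workflow summarized above: running an energy-based detector and then diagnosing its performance against reference annotations. It is a minimal sketch based on the example data shipped with ohun (the lbh1 and lbh2 recordings and the lbh_reference annotation table); the specific argument values, and to some extent the argument names, are assumptions and should be checked against the current package documentation.

```r
# Minimal sketch: energy-based detection followed by performance diagnosis.
# Assumes the 'lbh1', 'lbh2' and 'lbh_reference' example objects bundled
# with ohun; argument values below are illustrative, not prescriptive.
library(ohun)
library(tuneR)

# load example recordings and reference annotations
data(list = c("lbh1", "lbh2", "lbh_reference"))

# write the example recordings to a temporary directory as wav files
writeWave(lbh1, file.path(tempdir(), "lbh1.wav"))
writeWave(lbh2, file.path(tempdir(), "lbh2.wav"))

# energy-based detection: amplitude threshold plus bandpass and duration filters
detection <- energy_detector(
  files = c("lbh1.wav", "lbh2.wav"),
  path = tempdir(),
  threshold = 50,     # % of the maximum amplitude envelope (assumed value)
  bp = c(2, 9),       # bandpass limits in kHz (assumed values)
  min.duration = 90,  # minimum event duration in ms (assumed value)
  max.duration = 250  # maximum event duration in ms (assumed value)
)

# compare the detection to the reference annotations using
# signal detection theory indices (recall, precision, etc.)
diagnose_detection(reference = lbh_reference, detection = detection)
```

The same diagnosis step can be applied to detection outputs imported from other software, as long as they are formatted as annotation tables comparable to the reference.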
