Abstract

Passive acoustic monitoring using Autonomous Recording Units (ARUs) is becoming a significant research tool for collecting large amounts of ecological data. The northern bobwhite (Colinus virginianus) is an economically important game bird whose declining populations are of conservation concern, so efforts to monitor bobwhite abundance using ARUs are intensifying. However, manual processing of ARU data is time-consuming and often expensive, so developing automatic call-detection methods is a key step in acoustic monitoring. We present the first single-species convolutional neural network (CNN) developed purely for automatic identification and classification of bobwhite covey calls. We demonstrate the value of meaningful data augmentation by incorporating non-target calls and background noise into our training dataset, and we evaluate alternative CNN score thresholds and model extrapolation performance. We trained our CNN on 6,682 manually labeled covey calls across three groups of sites in the southeastern USA. Precision and AUC for both CNN classification and individual call detection were high (0.80-0.99), and our model showed strong extrapolation ability across site groups. However, extrapolation performance decreased significantly for sites that were more dissimilar to the training dataset when our meaningful data augmentation process was omitted. Our CNN detected significantly more covey calls than manual labeling in Raven Pro software, and processing time was greatly reduced: a single one-hour WAV file can now be analyzed by the CNN in roughly eight seconds. We also demonstrate, using a simple case study, that estimates of bobwhite site occupancy and detection vary substantially depending on the method of acoustic data processing (manual versus CNN). Our results suggest that our CNN provides robust, time-saving analysis of bobwhite covey-call acoustic data and can be applied to future research and monitoring projects with high confidence in the model's performance.
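The noise-overlay style of augmentation mentioned in the abstract can be pictured with a minimal sketch like the one below. This is illustrative only, not the authors' actual pipeline: the function name, the SNR parameter, and the clip handling are assumptions made for the example.

```python
import numpy as np

def mix_noise(call_clip: np.ndarray, noise_clip: np.ndarray, snr_db: float) -> np.ndarray:
    """Overlay a background-noise clip onto a call clip at a chosen signal-to-noise ratio (dB)."""
    # Repeat or truncate the noise so it matches the call clip length.
    noise = np.resize(noise_clip, call_clip.shape)
    call_power = np.mean(call_clip ** 2)
    noise_power = np.mean(noise ** 2) + 1e-12  # avoid division by zero for silent clips
    # Scale the noise so that 10*log10(call_power / scaled_noise_power) == snr_db.
    scale = np.sqrt(call_power / (noise_power * 10.0 ** (snr_db / 10.0)))
    return call_clip + scale * noise

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    call = np.sin(2 * np.pi * 1500 * np.arange(48000) / 48000)  # stand-in for a 1 s covey call
    noise = rng.normal(scale=0.1, size=48000)                   # stand-in for background noise
    augmented = mix_noise(call, noise, snr_db=10.0)
    print(augmented.shape)
```

Augmented clips produced this way (or by overlaying recorded non-target calls) would then be converted to spectrograms and added to the CNN training set alongside the original labeled calls.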