Abstract

Monitoring body length and body condition of individuals helps determine overall population health and assess adaptation to environmental changes. Aerial photogrammetry from drone-based videos is a valuable method for obtaining body length and body condition measurements of cetaceans. However, the laborious manual processing of drone-based videos to select frames for measuring animals ultimately delays assessment of population health and hinders conservation actions. Here, we apply deep learning methods to expedite the processing of drone-based videos and improve the efficiency of obtaining important morphological measurements of whales. We develop two user-friendly models that automatically (1) detect and output frames containing whales from drone-based videos ("DeteX") and (2) extract body length and body condition measurements from input frames ("XtraX"). We use drone-based videos of gray whales to compare manual and automated measurements (n = 86). Our results show that the automated methods reduced processing times to roughly one-ninth of the time required manually, while achieving accuracy similar to manual measurements (mean coefficient of variation <5%). We also demonstrate how these methods can be adapted to other species and identify remaining challenges to help further improve automated measurements in the future. Importantly, these tools greatly speed up the acquisition of key morphological data while maintaining accuracy, which is critical for effectively monitoring population health.
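The reported accuracy metric is the coefficient of variation (CV) between paired manual and automated measurements. As a minimal illustration only (not the authors' code), the CV for one whale's paired length estimates might be computed as sketched below; the measurement values and function name are hypothetical.

```python
import numpy as np

def coefficient_of_variation(measurements):
    """CV = sample standard deviation / mean, expressed as a percentage."""
    measurements = np.asarray(measurements, dtype=float)
    return 100.0 * measurements.std(ddof=1) / measurements.mean()

# Hypothetical replicate body-length estimates (m) for one whale:
# one from manual frame selection, one from the automated DeteX/XtraX workflow.
manual_length = 11.82
automated_length = 11.95

cv = coefficient_of_variation([manual_length, automated_length])
print(f"CV between manual and automated length: {cv:.2f}%")  # values under 5% indicate close agreement
```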