Abstract

Marine predators are integral to the functioning of marine ecosystems, and their consumption requirements should be integrated into ecosystem-based management policies. However, estimating prey consumption in diving marine predators requires innovative methods, as predator-prey interactions are rarely observable. We developed a novel method, validated by animal-borne video, that uses tri-axial acceleration and depth data to quantify prey capture rates in chinstrap penguins (Pygoscelis antarctica). These penguins are important consumers of Antarctic krill (Euphausia superba), a commercially harvested crustacean central to the Southern Ocean food web. We collected a large data set (n = 41 individuals) comprising overlapping video, accelerometer and depth data from foraging penguins. Prey captures were manually identified in videos, and those observations were used in supervised training of two deep learning neural networks (a convolutional neural network (CNN) and a V-Net). Although the CNN and V-Net architectures and input data pipelines differed, both trained models were able to predict prey captures from new acceleration and depth data (linear regression slope of predictions against video-observed prey captures = 1.13; R² ≈ 0.86). Our results illustrate that deep learning algorithms offer a means to process the large quantities of data generated by contemporary bio-logging sensors to robustly estimate prey capture events in diving marine predators.
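The validation described above — regressing model-predicted prey-capture counts against video-observed counts and reporting the slope and R² of that fit — can be sketched as follows. The per-dive counts here are illustrative placeholders, not the paper's data, and the NumPy routine is an assumed stand-in for whatever regression tooling the authors used:

```python
import numpy as np

# Hypothetical per-dive prey-capture counts: "observed" from manual video
# annotation, "predicted" by a trained model (illustrative numbers only).
observed = np.array([4, 10, 7, 15, 2, 9, 12, 6], dtype=float)
predicted = np.array([5, 11, 8, 17, 2, 10, 14, 6], dtype=float)

# Ordinary least-squares fit: predicted = slope * observed + intercept.
# A slope near 1 with a small intercept indicates unbiased counts.
slope, intercept = np.polyfit(observed, predicted, 1)

# Coefficient of determination (R^2) of the fitted line.
fitted = slope * observed + intercept
ss_res = np.sum((predicted - fitted) ** 2)
ss_tot = np.sum((predicted - predicted.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"slope = {slope:.2f}, R^2 = {r_squared:.2f}")
```

With these toy counts the slope is slightly above 1 (the model mildly over-counts captures on busy dives), mirroring how a slope of 1.13 in the paper indicates a small positive bias relative to the video ground truth.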
