Abstract

Studying and quantifying behaviour is important for understanding how animals interact with their environments. However, manually extracting and analysing behavioural data from the large volumes of camera footage collected is often time-consuming. Deep learning techniques have emerged as useful tools for automating the analysis of certain behaviours under controlled or laboratory conditions, but the complexities of raw field footage mean this technology remains largely unexplored as a data-analysis alternative for animals in situ. Here, we use deep learning techniques to automate the analysis of fish grazing behaviour from real-world field imagery. We collected video footage in seagrass meadows in Queensland, Australia, and trained models on a training data set of over 3000 annotations. We used a combination of dense optical flow to assess pixel movement in underwater footage, spatiotemporal filtering to increase accuracy, and deep learning algorithms to classify grazing behaviour of luderick, Girella tricuspidata. When tested on novel videos the model had not seen in training, it correctly identified nearly all individual grazing events. Deep learning shows promise as a viable tool for determining animal behaviour from underwater videos and, with further development, offers an alternative to current time-consuming manual methods of data extraction.

Highlights
• Deep learning and dense optical flow can automate the analysis of behaviour.
• This technique has a 92% accuracy in detecting grazing behaviour events.
• Spatiotemporal filtering as a postprocessing step significantly increases performance.
• Promising results call for further study into automating video analysis.
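To make the pipeline concrete, below is a minimal sketch (not the authors' implementation) of the two preprocessing steps the abstract names: dense optical flow to score per-frame pixel movement, followed by a simple temporal smoothing filter. It assumes OpenCV's Farneback dense optical flow; the file name, flow parameters, window length, and threshold are all hypothetical placeholders.

```python
import cv2
import numpy as np

# Sketch: per-frame motion scores from Farneback dense optical flow,
# followed by temporal smoothing. All parameters are illustrative.

cap = cv2.VideoCapture("underwater_clip.mp4")  # hypothetical input video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

scores = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow: one (dx, dy) displacement vector per pixel.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    scores.append(float(mag.mean()))  # mean pixel movement this frame
    prev_gray = gray
cap.release()

# Temporal filtering: average scores over a short window so brief,
# isolated flickers are not treated as sustained grazing motion.
window = 15  # hypothetical window length in frames
smoothed = np.convolve(scores, np.ones(window) / window, mode="same")
candidate_frames = np.where(smoothed > 0.5)[0]  # hypothetical threshold
```

In a pipeline like the one described, frame windows flagged this way would then be passed to a trained classifier to decide whether the motion corresponds to a grazing event.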
