Abstract

Wearable sensing technologies can be used for precision livestock production and to study foraging strategies, enabling a better understanding of the relationships between herbivores, vegetation, and landscape. In this context, monitoring grazing behavior (i.e., chew and bite events) can provide critical information for livestock management. This study presents a computational tool that utilizes wearable sensing and deep learning to distinguish chew and bite events in horses. A micro camera equipped with a microphone (0–18 kHz) was used to obtain video/audio data from horses during grazing. The collected audio data were filtered in a pre-processing step and then used to train a recurrent neural network (RNN) with a long short-term memory (LSTM) layer to detect and distinguish chew, bite, and noise events. A post-processing sliding window technique was used to discard events with low confidence levels or short durations. Initial evaluation of the system showed an accuracy of 88.64% for bite identification and 94.13% for chew identification. Distinguishing these events, and evaluating responses to different pasture species and structures, can provide useful information on the plant-animal interface. Combined with metrics such as bite rate, bite mass, and grazing time, this information can help determine management strategies that optimize intake and provide data for modeling foraging behavior to predict pasture use and animal performance.
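The abstract does not give implementation details, but the pipeline it describes (a pre-processing filter, an LSTM classifier over three classes, and a sliding-window post-filter on confidence and event length) can be sketched minimally in Python with SciPy and Keras. All layer sizes, filter cutoffs, thresholds, and function names below are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only: cutoffs, layer sizes, and thresholds are
    # assumptions, not the configuration used in the paper.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt
    from tensorflow import keras

    def bandpass(audio, fs, low=200.0, high=8000.0):
        """Pre-processing: band-pass filter to suppress out-of-band noise."""
        sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, audio)

    def build_model(n_steps, n_features, n_classes=3):
        """RNN with a single LSTM layer; softmax over chew/bite/noise."""
        model = keras.Sequential([
            keras.layers.Input(shape=(n_steps, n_features)),
            keras.layers.LSTM(64),
            keras.layers.Dense(n_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def postprocess(probs, labels, min_conf=0.7, min_len=3):
        """Post-processing: group consecutive identical frame labels into
        events, then drop events with low mean confidence or short length."""
        events, start = [], 0
        for i in range(1, len(labels) + 1):
            if i == len(labels) or labels[i] != labels[start]:
                conf = probs[start:i].max(axis=1).mean()
                if (i - start) >= min_len and conf >= min_conf:
                    events.append((start, i, int(labels[start])))
                start = i
        return events

Here probs would be the per-frame class probabilities from the LSTM and labels their argmax; the sliding-window filtering is approximated by run-length grouping, one plausible reading of the post-processing step described above.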
