Abstract

With recent technological advances, Global Positioning System (GPS) units and other tracking sensors can collect large amounts of data and transfer that information to the cloud, and then to producers, to remotely monitor livestock health and well-being. Currently, supervised machine learning techniques such as random forest and linear or quadratic discriminant analysis are used to develop algorithms that identify changes in animal behavior that may be associated with well-being concerns. However, supervised machine learning requires direct observation of the monitored animals, which may hinder their normal behavioral expression, and recording behavioral observations is time consuming and expensive. Our goal was to design a new unsupervised machine learning framework that identifies animal behavior without human observations. The framework contains two steps: the first segments the animal's tracking data using time series segmentation, and the second groups the segments into clusters, each representing one type of behavior. To validate the applicability of the proposed framework, we used GPS tracking data collected at 2-minute intervals from eight cows from May 28 to June 22, 2018, in a 1096 ha rangeland pasture near Prescott, Arizona. After extensive experiments, the framework partitioned each cow's movement using its speed, direction, and distance from water, and these segments were grouped into meaningful behavior clusters using cluster analysis. Speed was the most successful feature for clustering the segments into behaviors, and the results are similar to those of approaches based only on expert knowledge that rely on speed for classification. Our study demonstrated that unlabeled GPS tracking data can be used directly to group animal behaviors. The proposed unsupervised two-step framework allows cattle tracking data to be analyzed without direct human observation of behaviors and is applicable to the immense amount of data obtainable from real-time tracking and sensor devices.
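
To make the two-step idea concrete, the sketch below segments a speed series and clusters the resulting segments. It is a minimal illustration, not the study's implementation: it assumes the ruptures and scikit-learn packages, substitutes PELT change-point detection for the segmentation step and k-means on mean segment speed for the clustering step, and the penalty, cluster count, and synthetic speed values are illustrative assumptions rather than values from the paper.

```python
import numpy as np
import ruptures as rpt          # change-point detection (assumed dependency)
from sklearn.cluster import KMeans

def segment_and_cluster(speed, penalty=0.1, n_behaviors=3):
    """Step 1: split the speed series (one value per GPS fix) into
    homogeneous segments with PELT change-point detection.
    Step 2: summarize each segment by its mean speed and group the
    segments with k-means, so each cluster is a candidate behavior."""
    ends = rpt.Pelt(model="l2").fit(speed).predict(pen=penalty)
    bounds = [0] + ends                                # segment boundaries
    feats = np.array([[speed[a:b].mean()]              # one row per segment
                      for a, b in zip(bounds[:-1], bounds[1:])])
    k = min(n_behaviors, len(feats))                   # guard: few segments found
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(feats)
    return list(zip(bounds[:-1], bounds[1:], labels))

# Synthetic speeds (m/s) at 2-minute fixes: resting, grazing, traveling.
rng = np.random.default_rng(0)
speed = np.clip(np.concatenate([rng.normal(0.02, 0.01, 60),
                                rng.normal(0.15, 0.05, 60),
                                rng.normal(0.80, 0.10, 30)]), 0.0, None)
for start, end, label in segment_and_cluster(speed):
    print(f"fixes {start:3d}-{end:3d} -> behavior cluster {label}")
```

In practice, additional per-segment features such as movement direction and distance from water, as mentioned above, could be appended to the feature matrix before clustering.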
