Simple Summary

There are various systems available for health monitoring and heat detection in dairy cows. By continuously monitoring different behavioral patterns (e.g., lying, ruminating, and feeding), these systems detect behavioral changes linked to health disorders and estrus. Most systems were developed for cows kept indoors, and only a few are available for pasture-based farms. Systems developed for the barn fail to detect the targeted behaviors, and thereby their changes, on pasture, and vice versa. Therefore, our goal was to train and validate a machine learning model for the automated prediction of lying behavior in dairy cows kept on pasture as well as indoors. Data were collected on three dairy farms where cows were equipped with a collar-based prototype of the monitoring system and recorded with cameras in parallel. The resulting dataset was used to develop the machine learning model. The model performed well in predicting lying behavior both on pasture and in the barn. Building this model is therefore a successful first step towards a monitoring system for dairy cows kept on pasture and in the barn.

Abstract

Monitoring systems assist farmers in monitoring the health of dairy cows by predicting behavioral patterns (e.g., lying) and their changes with machine learning models. However, the available systems were developed either for indoor or for pasture use and fail to predict behavior in the other location. Therefore, the goal of our study was to train and evaluate a model for the prediction of lying on pasture and in the barn. On each of three farms, 7–11 dairy cows were equipped with a prototype of the monitoring system containing an accelerometer, a magnetometer, and a gyroscope. Video observations on pasture and in the barn provided ground truth data. We used 34.5 h of data from pasture for training and 480.5 h from both locations for evaluation.
Among the compared models, a random forest trained on an orientation-independent feature set with non-overlapping 5 s windows achieved the highest accuracy. Sensitivity, specificity, and accuracy were 95.6%, 80.5%, and 87.4%, respectively. Accuracy on pasture (93.2%) exceeded accuracy in the barn (81.4%). Ruminating while standing was the behavior most frequently confused with lying. Of the individual lying bouts, 95.6% and 93.4% were identified on pasture and in the barn, respectively. Adding a model for standing-up and lying-down events could improve the prediction of lying in the barn.
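The reported sensitivity, specificity, and accuracy follow directly from a binary confusion matrix with lying as the positive class. A minimal sketch of these definitions; the counts below are illustrative placeholders, not the study's data:

```python
def binary_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    """Sensitivity, specificity and accuracy for a binary classifier
    (lying = positive class)."""
    sensitivity = tp / (tp + fn)                # share of true lying windows found
    specificity = tn / (tn + fp)                # share of true non-lying windows found
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # share of all windows correct
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "accuracy": accuracy}

# Illustrative counts chosen to mirror the reported sensitivity/specificity:
print(binary_metrics(tp=956, fn=44, tn=805, fp=195))
```

Note that overall accuracy depends on the class balance of the evaluation set, which is why it need not equal the average of sensitivity and specificity.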

Highlights

  • Precision Livestock Farming (PLF) has gained importance in the dairy sector all over Europe over the last decade

  • Machine learning classification of sensor data is used in monitoring systems for dairy cows in order to predict the behavior of animals continuously and individually

  • We evaluated whether the behavior recognition is suitable for higher-level behavior analytics


Introduction

Precision Livestock Farming (PLF) has gained importance in the dairy sector all over Europe over the last decade. Different studies investigated automated activity recognition of humans [1,2,3] or animals [4,5] in real environments. These studies are based on technologies such as smart environments, the Internet of Things (IoT), machine learning, and big data. The standard strategy applied in these studies is to use sensors and to classify the sensor data into the desired activities or behavioral patterns by applying suitable machine learning models. This process is used in monitoring systems for dairy cows in order to predict the behavior of animals continuously and individually. As presented by Kamminga et al. [8] and Krause et al. [9], a combination of accelerometer, magnetometer, and gyroscope data can provide an orientation-independent dataset that ensures sufficient accuracy of the model, even when a low sampling rate is applied.
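As an illustration of orientation independence, the Euclidean norm of a triaxial accelerometer sample does not change when the collar rotates around the cow's neck. A minimal sketch, assuming a 10 Hz sampling rate and non-overlapping 5 s windows (50 samples); all function and parameter names are illustrative, not taken from the cited systems:

```python
import math

def vector_magnitude(ax: float, ay: float, az: float) -> float:
    """Euclidean norm of one accelerometer sample; invariant to sensor rotation."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def window_features(samples, window_size=50):
    """Mean vector magnitude per non-overlapping window of raw (ax, ay, az) samples.

    window_size=50 corresponds to 5 s at an assumed 10 Hz sampling rate.
    """
    feats = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        window = samples[start:start + window_size]
        mags = [vector_magnitude(*s) for s in window]
        feats.append(sum(mags) / len(mags))
    return feats

# A motionless cow yields magnitudes near 1 g regardless of collar orientation:
flat = [(0.0, 0.0, 1.0)] * 100     # gravity along the z-axis
rotated = [(1.0, 0.0, 0.0)] * 100  # same posture, collar rotated 90 degrees
print(window_features(flat), window_features(rotated))
```

Because both orientations produce identical feature values, a model trained on such features does not need the collar to sit in a fixed position on the animal.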

