Machine learning combined with sensor devices in agriculture enables systems that can efficiently provide real-time knowledge of animal behaviour without the need for intensive human observation, which is time-consuming and labour-demanding. In this study, we propose an intelligent system to classify three important activity states of sheep, namely "grazing", "active", and "inactive". We acquire primary data from Hebridean ewes using two types of sensors to capture the required activities. To address heterogeneity across sensor data and variation in sensor orientation and placement, we use a convolutional neural network (CNN) in conjunction with hand-crafted features to improve model generalisation, specifically with respect to sensor orientation and position. Additionally, we utilise transfer learning (TL) for model generalisation, which indicates substantial potential for future studies of animal activity recognition. More specifically, TL enables the reuse of a pre-trained model on entirely unseen data without further model training or data labelling, both of which are highly time-consuming tasks. We performed experiments on datasets using the CNN with automatically learned features alone, and on datasets augmented with the additional hand-crafted features. Our method obtained an overall accuracy of 98.55% on the source data (i.e., the dataset captured by the first sensor) and 96.59% on the target data (i.e., the dataset captured by the second sensor) when using the datasets that included the supplementary features. When the CNN was applied to the raw datasets without any additional features, accuracies of 97.46% and 94.79% were obtained on the source and target datasets, respectively. This study is the first of its kind to propose CNN-based TL for sheep activity recognition and to demonstrate the significance of the proposed approach in the context of data capture, data labelling, and the heterogeneity of sensor devices.
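For illustration, the sketch below shows one way a fusion of CNN-learned and hand-crafted features could be arranged, with the convolutional feature extractor frozen when transferring to the target sensor. The PyTorch framing, layer sizes, window length, feature count, and class ordering are assumptions for illustration only, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class SheepActivityNet(nn.Module):
    """Illustrative 1D-CNN that fuses raw sensor windows with hand-crafted features.
    All layer sizes and the window length are assumptions, not the paper's values."""
    def __init__(self, n_channels=3, n_handcrafted=12, n_classes=3):
        super().__init__()
        # Convolutional branch learns features from raw sensor windows.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                # -> (batch, 64, 1)
        )
        # Classifier head combines learned and hand-crafted features.
        self.classifier = nn.Sequential(
            nn.Linear(64 + n_handcrafted, 64), nn.ReLU(),
            nn.Linear(64, n_classes),               # grazing / active / inactive
        )

    def forward(self, window, handcrafted):
        # window: (batch, channels, time); handcrafted: (batch, n_handcrafted)
        learned = self.conv(window).squeeze(-1)
        return self.classifier(torch.cat([learned, handcrafted], dim=1))

# Transfer-learning step (illustrative): reuse the feature extractor trained
# on the source-sensor data and keep it frozen for the target sensor.
model = SheepActivityNet()
for p in model.conv.parameters():
    p.requires_grad = False
```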