Abstract

Patterns in pig activity can be an indicator of animal health and welfare. This motivates researchers to develop Precision Livestock Farming (PLF) tools for automated monitoring of pig activity levels. In this study we compared two technologies that can be used for this purpose: ear tag accelerometers and computer vision. Additionally, we compared both technologies with a gold standard based on human labelling. A state-of-the-art object detection algorithm, RetinaNet, was trained on 9969 images and validated on 4273 images to automatically detect the head of a sow, the body of a sow, the left ear, the right ear and a hay rack. It was possible to detect these objects with a performance of 0.26 mAP@0.5:0.95. The activity of 6 sows was derived from the detected body parts and compared with activity measurements based on ear tag accelerometer data. The dynamic relation between the activity measurements from the two technologies was modelled with Transfer Function (TF) models. For all 6 animals, body activity based on object detection was very similar to the accelerometer-based activity measurement (R² > 0.7). Similarly, the R² between head activity and accelerometer-based activity was high for most sows (R² > 0.7). The results of fitting TF models to activity data from the ear tag accelerometers and to the object detection output for the body and head of the sows suggest that the two technologies, accelerometer and computer vision, provide very similar information on animal activity levels. The presented computer vision method is limited to monitoring a single animal in the camera view, as detected body parts cannot be associated with individual animals when several are present. Moreover, we expect that achieving satisfactory performance in different environments will require re-training the RetinaNet object detection algorithm with additional images collected on other farms. A computer vision approach might be advantageous in some PLF applications, as it is non-invasive and may be less laborious than methods based on ear tag accelerometer data.
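The abstract does not include the authors' code. As a rough, hypothetical Python illustration of the pipeline it describes, the sketch below derives an activity proxy from the frame-to-frame displacement of detected bounding-box centroids and relates it to an accelerometer-based activity signal with a simple ARX-type transfer function fitted by ordinary least squares. All function names, model orders and inputs are assumptions, and the ARX least-squares fit stands in for whatever TF estimation method the authors actually used.

# Hypothetical sketch, not the authors' implementation.
import numpy as np

def activity_from_boxes(boxes: np.ndarray) -> np.ndarray:
    """Frame-to-frame displacement of box centroids as an activity proxy.

    boxes: (n_frames, 4) array of [x_min, y_min, x_max, y_max] detections
    for a single body part (e.g. the sow's body or head).
    """
    centroids = np.column_stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                                 (boxes[:, 1] + boxes[:, 3]) / 2])
    # Euclidean distance between consecutive centroids; first frame gets 0.
    steps = np.linalg.norm(np.diff(centroids, axis=0), axis=1)
    return np.concatenate([[0.0], steps])

def fit_arx(u: np.ndarray, y: np.ndarray, na: int = 2, nb: int = 2):
    """Fit y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j] by least squares.

    u: input (vision-based activity), y: output (accelerometer activity),
    sampled at the same rate. na, nb are assumed model orders.
    Returns (a, b, r_squared) for the one-step-ahead prediction.
    """
    n0 = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n0, len(y))]
    X, target = np.asarray(rows), y[n0:]
    theta, *_ = np.linalg.lstsq(X, target, rcond=None)
    pred = X @ theta
    ss_res = np.sum((target - pred) ** 2)
    ss_tot = np.sum((target - target.mean()) ** 2)
    return theta[:na], theta[na:], 1.0 - ss_res / ss_tot

# Example usage (file name and variables are hypothetical):
# boxes = np.load("body_boxes.npy")          # per-frame detections of one part
# vision_act = activity_from_boxes(boxes)
# a, b, r2 = fit_arx(vision_act, accel_act)  # accel_act: ear tag activity

The R² returned here is a goodness-of-fit of the vision-derived signal's prediction of the accelerometer signal, which is one plausible reading of the R² > 0.7 figures reported in the abstract.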
