Simple Summary

Animals express responses to internal and external stimuli through changes in behavior, so physical activity has long been used as an indicator of animal health and welfare. A deep-learning-based pig posture and locomotion activity detection and tracking algorithm was designed to measure such behavior changes in an experimental pig barn at different greenhouse gas (GHG) levels. The naturally occurring GHGs in the barn were elevated by closing the ventilators for one hour in the morning, during the day, and at night, and the corresponding pig posture and locomotion activity were measured before, during, and after each treatment. As GHG concentrations rose, the pigs became less active and spent more time in a lateral-lying posture, while standing, sternal-lying, and walking activities decreased. Monitoring and tracking pigs' physical behaviors with a simple RGB camera and a deep-learning object detection model, coupled with a real-time tracking algorithm, can therefore provide effective, individual-level health and welfare monitoring.

Abstract

Pig behavior is an integral part of health and welfare management, as pigs usually reflect their inner state through behavior changes. The livestock environment plays a key role in pigs' health and wellbeing: a poorly managed barn environment accumulates toxic GHGs, which may deteriorate pigs' health and welfare. In this study, a computer-vision-based automatic monitoring and tracking model was proposed to detect pigs' short-term physical activities in a compromised environment. The ventilators of the livestock barn were closed for one hour, three times a day (07:00–08:00, 13:00–14:00, and 20:00–21:00), creating a compromised environment that significantly increased GHG levels. The corresponding pig activities were observed before, during, and after each one-hour treatment.
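Once each pig carries a persistent track identity and a per-frame posture label, the posture durations compared across treatment periods can be accumulated by simple counting. The sketch below illustrates this bookkeeping step; the frame rate, data layout, and function name are illustrative assumptions, not details from the paper.

```python
from collections import defaultdict

def posture_durations(track_labels, fps=30):
    """Accumulate seconds spent in each posture per tracked pig.

    track_labels: iterable of (pig_id, posture) pairs, one per pig per
    frame, e.g. as produced by a detector + tracker pipeline.
    fps: assumed camera frame rate (each label covers 1/fps seconds).
    """
    frame_time = 1.0 / fps
    durations = defaultdict(lambda: defaultdict(float))
    for pig_id, posture in track_labels:
        durations[pig_id][posture] += frame_time
    return durations
```

Summing these per-pig totals within the before/during/after windows gives the posture-duration comparisons reported in the study.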
Two widely used object detection models (YOLOv4 and Faster R-CNN) were trained, and their performances were compared in terms of pig localization and posture detection. YOLOv4, which outperformed the Faster R-CNN model, was coupled with a Deep-SORT tracking algorithm to detect and track pig activities. The results revealed that the pigs became more inactive as GHG concentration increased, reducing their standing and walking activities. Moreover, the pigs shortened their sternal-lying posture and increased their lateral-lying posture duration at higher GHG concentrations. The high detection accuracy (mAP: 98.67%) and tracking accuracy (MOTA: 93.86% and MOTP: 82.41%) signify the models' efficacy for non-invasive monitoring and tracking of pigs' physical activities.
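The MOTA and MOTP scores reported above are the standard CLEAR-MOT multi-object tracking metrics: MOTA penalizes misses, false positives, and identity switches relative to the number of ground-truth objects, while MOTP averages the overlap of correct matches. A minimal sketch of the computation, assuming per-frame matching results are already available (the data layout is an assumption, not the paper's code):

```python
def clear_mot(frames):
    """Compute CLEAR-MOT metrics from per-frame matching results.

    Each frame is a dict with:
      gt        - number of ground-truth objects in the frame
      matches   - IoU overlaps of matched (ground-truth, hypothesis) pairs
      fp        - false positives (hypotheses with no ground-truth match)
      fn        - misses (ground-truth objects with no hypothesis)
      id_switch - identity switches occurring in this frame
    """
    total_gt = sum(f["gt"] for f in frames)
    errors = sum(f["fp"] + f["fn"] + f["id_switch"] for f in frames)
    overlaps = [o for f in frames for o in f["matches"]]

    # MOTA: 1 minus the ratio of all tracking errors to ground-truth objects
    mota = 1.0 - errors / total_gt
    # MOTP: mean overlap (IoU) over all correctly matched pairs
    motp = sum(overlaps) / len(overlaps)
    return mota, motp
```

In practice a library such as py-motmetrics performs the frame-by-frame matching (e.g. via IoU distance) before these ratios are taken; the sketch only shows the final aggregation.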

Highlights

  • Pig behavior is a key trait for recognizing their health and welfare conditions [1]

  • CO2 was the dominant greenhouse gas (GHG) in this experimental pig barn, followed by carbon monoxide (CO) and nitric oxide (NO), whereas N2O was found at the lowest concentration

  • A large feeder structure obstructed the camera view of some pigs throughout the study period, especially those that stayed inside the feeder


Introduction

Pig behavior is a key trait for recognizing their health and welfare conditions [1]. Regular monitoring of pigs' physical activity is essential to identify short- and long-term stresses [2]. Although round-the-clock monitoring of pigs in precision farming provides invaluable information about their physical and biological status, manually monitoring every single pig on a large-scale commercial farm is impractical, as it requires a high number of staff per animal and increases production costs. Staff can observe each pig only briefly and might miss subtle changes in the pigs' activity [3]. Moreover, the presence of a human in the barn influences the pigs' behavior, leading to unusual activity that can be misinterpreted during decision making [4,5]. Sensor-based, non-disturbing automatic monitoring of pigs is therefore increasingly being used.
