Abstract

Pork is an essential source of protein in Taiwan and many other countries, and maintaining the weaning rate of piglets is crucial to meeting increasing demand. Newborn piglets are relatively fragile and require close attention; however, manual observation is time-consuming and labor-intensive. This study aimed to develop an automated approach to recognize the lactation frequency of sows, localize piglets, track individual piglets, and quantify their movements in videos. Embedded systems integrated with cameras were developed to capture bird’s-eye-view videos of sows and piglets in a farrowing house; the videos were then transmitted to a cloud server and converted to images. A combination of EfficientNet and long short-term memory (LSTM) was trained to recognize lactation behavior from the videos, and a refined rotation RetinaNet (R3Det) model was trained to localize the piglets. Subsequently, the simple online and real-time tracking (SORT) algorithm was applied to track individual piglets and quantify their movements. The EfficientNet–LSTM combination achieved an overall accuracy of 97.67% in lactation behavior recognition, with an inference time of 7.7 ms per 1-min video on a GPU. The trained R3Det model achieved an overall mean average precision of 87.90%, precision of 93.52%, recall of 88.52%, and a processing speed of 10.2 fps on a GPU. Piglet tracking with SORT achieved an overall multiple object tracking accuracy (MOTA) of 97.35%, multiple object tracking precision (MOTP) of 96.97%, IDF1 of 98.30%, and a processing speed of 171.6 fps on a CPU. These results demonstrate the feasibility of deploying the proposed approaches on typical pig farms.
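For readers unfamiliar with the tracking metrics cited in the abstract, MOTA and IDF1 follow the standard CLEAR MOT and identity-based definitions; the sketch below illustrates those formulas only (the function names and example counts are illustrative, not the authors' code).

```python
def mota(false_negatives: int, false_positives: int, id_switches: int, num_gt: int) -> float:
    """Multiple object tracking accuracy (CLEAR MOT):
    MOTA = 1 - (FN + FP + IDSW) / GT, where GT is the total
    number of ground-truth object instances across all frames."""
    return 1.0 - (false_negatives + false_positives + id_switches) / num_gt

def idf1(idtp: int, idfp: int, idfn: int) -> float:
    """IDF1: harmonic mean of identity precision and identity recall,
    computed from identity-matched true positives (IDTP), false
    positives (IDFP), and false negatives (IDFN)."""
    return 2 * idtp / (2 * idtp + idfp + idfn)

# Illustrative counts only:
print(mota(10, 5, 5, 100))   # 20 errors over 100 GT instances -> 0.8
print(idf1(40, 10, 10))      # 80 / 100 -> 0.8
```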
