Abstract

With the vigorous development of intelligent agriculture, automated, large-scale, and intensive pig farming has progressed significantly. As a biological feature, the pig face has important research value for precision breeding and health traceability of pigs. In live pig management, many operators still rely on traditional methods such as color marking and RFID identification, but these suffer from problems such as tag loss, tag confusion, and wasted manpower. This work proposes a non-invasive approach to identifying multiple individual pigs. The model first replaces the original backbone network of YOLOv4 with MobileNet-v3, a popular lightweight network. Depth-wise separable convolution is then adopted in YOLOv4's feature-extraction modules, SPP and PANet, to further reduce the number of network parameters. Moreover, a CBAM attention mechanism, formed by concatenating a channel attention module (CAM) and a spatial attention module (SAM), is added to PANet to preserve accuracy while reducing model weight. This multi-attention mechanism selectively strengthens key regions of the pig face and filters out weakly correlated features, improving the overall performance of the model. Finally, the improved MobileNetv3-YOLOv4-PACNet (M-YOLOv4-C) network model is proposed to identify individual sows. The mAP reached 98.15 %, the detection speed reached 106.3 frames/s, and the model size was only 44.74 MB, so the model can be embedded in small pig-house management sensors and applied to pig management systems in a lightweight, fast, and accurate manner. This model will provide support for subsequent pig behavior recognition and posture analysis.
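To make the described modifications concrete, the sketch below shows, in PyTorch, how a CBAM block (channel attention followed by spatial attention) and a depth-wise separable convolution could be composed on a PANet feature map. The module names, reduction ratio, kernel sizes, and channel dimensions are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """CAM: pool over spatial dims, then reweight each channel."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        attn = self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x))
        return x * self.sigmoid(attn)


class SpatialAttention(nn.Module):
    """SAM: pool over channels, then reweight each spatial location."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = self.conv(torch.cat([avg_map, max_map], dim=1))
        return x * self.sigmoid(attn)


class CBAM(nn.Module):
    """CBAM: channel attention (CAM) followed by spatial attention (SAM)."""
    def __init__(self, channels):
        super().__init__()
        self.cam = ChannelAttention(channels)
        self.sam = SpatialAttention()

    def forward(self, x):
        return self.sam(self.cam(x))


class DepthwiseSeparableConv(nn.Module):
    """Depth-wise 3x3 conv per channel + 1x1 point-wise conv, replacing a standard conv."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU6(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


if __name__ == "__main__":
    x = torch.randn(1, 256, 52, 52)  # hypothetical PANet feature map
    block = nn.Sequential(DepthwiseSeparableConv(256, 256), CBAM(256))
    print(block(x).shape)            # torch.Size([1, 256, 52, 52])
```

Composing the attention block after the lightweight convolution in this way illustrates how parameter count can be cut while key facial regions are still emphasized, which is the trade-off the abstract describes.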
