Abstract

As the poultry farming industry attaches increasing importance to poultry welfare, understanding the daily behaviours of poultry is becoming more important. Eating is an important daily behaviour in poultry. Based on the difference between the eating vocalization and the normal vocalization of poultry, this study proposes an automatic detection method for poultry eating behaviour based on audio analysis and a time-sequence model. First, the short-time energy (STE) and short-time zero-crossing rate (STZ) of poultry vocalizations and environmental sounds were analysed, and a four-threshold automatic selection method for poultry vocalizations based on STE and STZ was proposed. Then, the selected vocalizations were characterized by their variation over time. Three types of poultry vocalization network (PV-net) were proposed to classify the eating vocalization and the normal vocalization of poultry: PV-net1, PV-net2 and PV-net3. In the experiments, vocalizations from 18 chickens were collected. The results showed that the recognition rates of PV-net1, PV-net2 and PV-net3 were 93.5%, 94.5% and 96%, respectively; their sensitivities were 96%, 97% and 96%, respectively; and their specificities were 91%, 92% and 96%, respectively. This method benefits the protection of poultry welfare and will be useful for monitoring and controlling the eating behaviour of poultry in the poultry farming industry.
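The abstract describes a frame-level selection step based on short-time energy (STE) and short-time zero-crossing rate (STZ) with four thresholds. The Python/NumPy sketch below illustrates one way such features could be computed and thresholded; the frame length, hop size, and threshold values are illustrative assumptions, not the paper's reported parameters.

    # Minimal sketch of STE/STZ frame features and a four-threshold
    # selection rule, assuming a mono signal sampled at 16 kHz with
    # amplitudes normalized to [-1, 1]. All parameter values below are
    # placeholders, not the values used in the paper.
    import numpy as np

    def frame_features(signal, frame_len=400, hop=160):
        """Return per-frame short-time energy and zero-crossing rate."""
        n_frames = max(0, 1 + (len(signal) - frame_len) // hop)
        ste = np.empty(n_frames)
        stz = np.empty(n_frames)
        for i in range(n_frames):
            frame = signal[i * hop : i * hop + frame_len].astype(np.float64)
            ste[i] = np.sum(frame ** 2)                           # short-time energy
            stz[i] = np.mean(np.abs(np.diff(np.sign(frame))) > 0)  # zero-crossing rate
        return ste, stz

    def select_vocalization_frames(ste, stz,
                                   ste_low=0.05, ste_high=5.0,
                                   stz_low=0.02, stz_high=0.4):
        """Keep frames whose STE and STZ both fall inside a band defined by
        four thresholds, rejecting quiet background frames (low STE) and
        broadband noise frames (high STZ)."""
        return (ste > ste_low) & (ste < ste_high) & (stz > stz_low) & (stz < stz_high)

Frames passing both bands would be retained as candidate poultry vocalizations and passed on to the time-sequence classifier, while background and noise frames are discarded.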
