The health of poultry flocks is crucial to sustainable farming. Recent advances in machine learning and speech analysis have opened up opportunities for real-time monitoring of flock behavior and health. However, there has been little research on using Tiny Machine Learning (Tiny ML) for continuous vocalization monitoring in poultry. This study addresses that gap by developing and deploying Tiny ML models on low-power edge devices to monitor chicken vocalizations. The focus is on overcoming challenges such as memory limitations, processing power, and battery life to ensure practical implementation in agricultural settings. In collaboration with avian researchers, a diverse dataset of poultry vocalizations representing a range of health and environmental conditions was created to train and validate the algorithms. Digital Signal Processing (DSP) blocks of the Edge Impulse platform were used to generate spectral features for studying fowl vocalization, and a one-dimensional Convolutional Neural Network (CNN) model was employed for classification. The study emphasizes accurately identifying and categorizing different chicken noises associated with emotional states such as discomfort, hunger, and satisfaction. To improve accuracy and suppress background noise, noise-robust Tiny ML algorithms were developed. Before the removal of background noise, the average accuracy and F1 score were 91.6% and 0.92, respectively; after removal, they improved to 96.6% and 0.95.
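To make the described pipeline concrete, the sketch below shows how spectral features from short vocalization clips can feed a small one-dimensional CNN that is later quantized for an edge device. It is an illustrative assumption, not the authors' exact Edge Impulse configuration: the feature settings (MFCCs, frame count), layer sizes, and class labels are placeholders chosen for demonstration.

```python
# Illustrative sketch only: the paper uses Edge Impulse DSP blocks and a 1D CNN;
# all parameters below (N_MFCC, FRAMES, layer widths) are assumed values.
import numpy as np
import librosa
import tensorflow as tf

CLASSES = ["discomfort", "hunger", "satisfaction"]  # emotional states named in the abstract
N_MFCC = 13   # assumed number of spectral coefficients per frame
FRAMES = 49   # assumed number of frames per one-second clip

def extract_features(wav_path, sr=16000):
    """Load a vocalization clip and compute fixed-size MFCC spectral features."""
    y, _ = librosa.load(wav_path, sr=sr, duration=1.0)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC)
    # Pad or trim so every clip yields the same (FRAMES, N_MFCC) input shape
    mfcc = librosa.util.fix_length(mfcc, size=FRAMES, axis=1)
    return mfcc.T.astype(np.float32)

def build_model():
    """Small 1D CNN sized for conversion to a TinyML (TFLite) model."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(FRAMES, N_MFCC)),
        tf.keras.layers.Conv1D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dropout(0.25),
        tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# After training, the model could be quantized for a low-power edge device:
# converter = tf.lite.TFLiteConverter.from_keras_model(model)
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# tflite_model = converter.convert()
```

A model of this size, once quantized to 8-bit integers, typically fits within the memory budget of microcontroller-class hardware, which is the constraint the study targets.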