Abstract

Vocal expression of emotions has been observed across species and could provide a non-invasive and reliable means to assess animal emotions. We investigated whether pig vocal indicators of emotions revealed in previous studies are valid across call types and contexts, and could potentially be used to develop an automated emotion monitoring tool. We analysed an extensive and unique dataset of low-frequency (LF) and high-frequency (HF) calls emitted by pigs across numerous commercial contexts from birth to slaughter (7414 calls from 411 pigs). Our results revealed that the valence attributed to the production contexts (positive versus negative) affected all investigated parameters in both LF and HF calls. Similarly, the context category affected all parameters. We then tested two different automated methods for call classification; a neural network achieved much higher classification accuracy than a permuted discriminant function analysis (pDFA), both for valence (neural network: 91.5%; pDFA weighted average across LF and HF calls (cross-classified): 61.7%, with a chance level of 50.5%) and context (neural network: 81.5%; pDFA weighted average across LF and HF calls (cross-classified): 19.4%, with a chance level of 14.3%). These results suggest that an automated recognition system can be developed to monitor pig welfare on-farm.

Highlights

  • Vocal expression of emotions has been observed across species and could provide a non-invasive and reliable means to assess animal emotions

  • All Linear Mixed-Effects Models (LMM) revealed an effect of the valence for both low-frequency (LF) and high-frequency (HF) calls (Fig. 1; p ≤ 0.001 for all models)

  • Q50% (Fig. 1c) measured in LF calls was higher in positive contexts than in negative contexts, while the opposite was found for high-frequency (HF) calls (R2GLMM(m): LF = 0.05, HF = 0.04)

Introduction

Vocal expression of emotions has been observed across species and could provide a non-invasive and reliable means to assess animal emotions. Systems for automatic acoustic recognition of physiological and stress states have already been developed for cattle[13,14] and pigs[15]. These systems detect specific sounds (e.g. high-frequency calls), which may serve as first indicators of impaired welfare[16]. The real challenge remains to create a tool that can accurately identify the emotional states of the animals based on real-time call detection and classification in various environments.
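As a rough illustration of the kind of spectral feature such detection systems rely on, the sketch below labels a call LF or HF by its dominant frequency. This is a minimal assumption-laden example, not the method used in the study: the `classify_call` function, the 1 kHz threshold, and the synthetic test tones are all hypothetical choices made for this sketch.

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) carrying the most spectral energy."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def classify_call(signal, sample_rate, hf_threshold_hz=1000.0):
    """Label a call 'LF' or 'HF' by its dominant frequency.

    The 1 kHz threshold is an illustrative assumption, not a value
    taken from the study.
    """
    if dominant_frequency(signal, sample_rate) >= hf_threshold_hz:
        return "HF"
    return "LF"

# Synthetic example: a 300 Hz grunt-like tone vs. a 2 kHz scream-like tone.
sr = 16000
t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)
grunt = np.sin(2 * np.pi * 300 * t)
scream = np.sin(2 * np.pi * 2000 * t)
print(classify_call(grunt, sr))   # LF
print(classify_call(scream, sr))  # HF
```

A real on-farm system would of course work on noisy recordings and use many more acoustic parameters (and, as the study shows, a trained classifier rather than a single threshold), but the principle of mapping spectral features to call categories is the same.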

