Abstract

Facial expression is a common channel for the communication of emotion. However, in the case of non-human animals, the analytical methods used to quantify facial expressions can be subjective, relying heavily on extrapolation from human-based systems. Here, we demonstrate how geometric morphometrics can be applied to overcome these problems. We used this approach to identify and quantify changes in facial shape associated with pain in a non-human animal species. Our method accommodates individual variability, species-specific facial anatomy, and postural effects. Facial images were captured at four different time points during ovariohysterectomy of domestic short-haired cats (n = 29), with time points corresponding to varying intensities of pain. Images were annotated using landmarks specifically chosen for their relationship with underlying musculature and their relevance to cat-specific facial action units. Landmark data were subjected to normalisation before Principal Components (PCs) were extracted to identify key sources of facial shape variation relative to pain intensity. A significant relationship between PC scores and a well-validated composite measure of post-operative pain in cats (the UNESP-Botucatu MCPS tool) was evident, demonstrating good convergent validity between our geometric face model and other metrics of pain detection. This study lays the foundation for the automatic, objective detection of emotional expressions in a range of non-human animal species.
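
For readers unfamiliar with geometric morphometrics, the pipeline summarised above (landmark annotation, normalisation, then extraction of Principal Components) can be illustrated with a short sketch. The code below assumes the normalisation step is a Generalized Procrustes Analysis, the usual alignment in geometric morphometrics; the array names `landmarks` and `pain_scores` are hypothetical placeholders, and the snippet is a minimal sketch rather than the study's actual implementation.

```python
# Minimal sketch of a landmark-based shape analysis pipeline, assuming the
# normalisation step is a Generalized Procrustes Analysis (GPA) followed by
# PCA on the aligned configurations. Array names are illustrative only.

import numpy as np


def procrustes_align(shapes, n_iter=10):
    """Align 2-D landmark configurations (n_specimens, n_landmarks, 2)
    by translation, scaling and rotation (ordinary GPA)."""
    # Centre each configuration and scale to unit centroid size
    shapes = shapes - shapes.mean(axis=1, keepdims=True)
    shapes = shapes / np.linalg.norm(shapes, axis=(1, 2), keepdims=True)

    mean_shape = shapes[0]
    for _ in range(n_iter):
        aligned = []
        for s in shapes:
            # Optimal rotation of s onto the current mean (orthogonal Procrustes)
            u, _, vt = np.linalg.svd(s.T @ mean_shape)
            aligned.append(s @ (u @ vt))
        shapes = np.stack(aligned)
        mean_shape = shapes.mean(axis=0)
        mean_shape = mean_shape / np.linalg.norm(mean_shape)
    return shapes


def shape_pcs(aligned, n_components=5):
    """Flatten aligned landmarks and return the leading PC scores."""
    X = aligned.reshape(len(aligned), -1)
    X = X - X.mean(axis=0)
    # SVD-based PCA: rows of vt are the shape principal axes
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T


# Hypothetical usage: 'landmarks' is an (n_images, n_landmarks, 2) array of
# annotated facial landmarks, 'pain_scores' the matching composite pain totals.
# aligned = procrustes_align(landmarks)
# pcs = shape_pcs(aligned)
# r = np.corrcoef(pcs[:, 0], pain_scores)[0, 1]  # convergent validity check
```

In this framing, convergent validity corresponds to a significant association between the leading PC scores and the composite pain-scale totals, which is the relationship the study reports.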

Highlights

  • Neonates, unlike verbally-capable humans, cannot self-report distressing experiences

  • Differences in facial expression are classified in terms of ‘action units’, with various photographic[4,8,13,14] and illustrative scales[9,10,11] generated to depict expressions associated with pain, as well as its supposed intensity

  • The dataset for the cats used in this study was collected previously to validate a composite pain scale in domestic cats (not involving the face), and its use was approved by the Institutional Animal Research Ethical Committee of the FMVZ-UNESP-Botucatu under protocol number 20/2008

Introduction

Neonates, unlike verbally-capable humans, cannot self-report distressing experiences. With the exception of one recent study in ferrets[14], the action units reported have relied upon extrapolation from human systems (i.e. the human Facial Action Coding System (FACS))[15]. In so doing, these studies appropriate the descriptors developed to quantify changes in human facial muscles, and lack an understanding of species-specific underlying facial musculature and the associated expressive repertoire. This is potentially problematic given that the facial expression of similar emotional states, and the associated musculature, can differ between humans and non-human animals[16]. These previous reports are therefore subject to a range of potential anthropocentric biases, including which expressions are attended to and how they are subsequently interpreted.
