Abstract

Facial expressions are a core component of the emotional response of social mammals. In contrast to Darwin's original proposition, expressive facial cues of emotion appear to have evolved to be species-specific. Faces trigger an automatic perceptual process, so inter-specific emotion perception is potentially a challenge: observers cannot simply "read" heterospecific facial expressions in the same way that they read conspecific ones. Using dynamic spontaneous facial expression stimuli, we report the first inter-species eye-tracking study with fully unrestrained participants and no pre-experiment training to maintain attention to the stimuli, comparing how two species living in the same ecological niche, humans and dogs, perceive each other's facial expressions of emotion. Humans and dogs showed different gaze distributions when viewing the same facial expressions of either humans or dogs. Humans modulated their gaze depending on the area of interest (AOI) being examined, the emotion, and the species observed, whereas dogs modulated their gaze depending on AOI only. We also analysed whether gaze distribution across AOIs was random in both species: in humans, eye movements were not correlated with the diagnostic facial movements occurring in the emotional expression, and in dogs there was only a partial relationship. This suggests that the scanning of facial expressions is a relatively automatic process. Thus, to read other species' facial emotions successfully, individuals must overcome these automatic perceptual processes and employ learning strategies to appreciate the inter-species emotional repertoire.

Highlights

  • Faces are one of the main visual channels used to convey emotional information in humans (e.g., Smith and Schyns 2009), but face-based emotion recognition (FaBER) might be quite widespread in mammals (Tate et al. 2006) due to its adaptive value

  • Human face-viewing gaze allocation depended primarily on the area of interest (AOI), the species of the viewed face, and the facial expression

  • When humans looked at dog faces expressing happiness and fear, the eyes and mouth attracted more attention than all other AOIs


Introduction

Faces are one of the main visual channels used to convey emotional information in humans (e.g., Smith and Schyns 2009), but face-based emotion recognition (FaBER) might be quite widespread in mammals (Tate et al. 2006) due to its adaptive value. A facial expression can be an intrinsic part of the emotional response and/or a more developed social communicative action (Frijda 1986). Inter-species emotion recognition potentially poses a challenge for individuals, as context-specific emotional cues can be intra-specific (e.g., Caeiro et al. 2017). The human–dog dyad is an ideal model for studying inter-specific perception of emotional cues, owing to the two species' shared history and ecological niche (Skoglund et al. 2015) and potential cognitive co-evolution (Hare 2007). Understanding how humans and dogs perceive each other's facial cues of emotion has important implications for both human public safety and dog welfare.
