Abstract

Several approaches have been proposed for the analysis of pain-related facial expressions. These approaches range from common classification architectures based on a set of carefully designed handcrafted features, to deep neural networks characterised by an autonomous extraction of relevant facial descriptors and simultaneous optimisation of a classification architecture. In the current work, an end-to-end approach based on attention networks for the analysis and recognition of pain-related facial expressions is proposed. The method combines both spatial and temporal aspects of facial expressions through a weighted aggregation of the outputs of attention-based neural networks fed with sequences of Motion History Images (MHIs) and Optical Flow Images (OFIs). Each input stream is fed into a specific attention network consisting of a Convolutional Neural Network (CNN) coupled with a Bidirectional Long Short-Term Memory (BiLSTM) Recurrent Neural Network (RNN). An attention mechanism generates a single weighted representation of each input stream (MHI sequence and OFI sequence), which is subsequently used to perform stream-specific classification tasks. Simultaneously, a weighted aggregation of the classification scores specific to each input stream is performed to generate the final classification output. The assessment conducted on both the BioVid Heat Pain Database (Part A) and the SenseEmotion Database points to the relevance of the proposed approach, as its classification performance is on par with state-of-the-art classification approaches proposed in the literature.
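
As a rough illustration only, the following PyTorch sketch mirrors the pipeline outlined above (per-frame CNN, BiLSTM, attention pooling per stream, weighted aggregation of the two streams' scores). The backbone layers, hidden sizes, input resolution and the learnable fusion weights are assumptions made for this sketch, not the authors' configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StreamAttentionNet(nn.Module):
    """One input stream (MHI or OFI sequence): CNN -> BiLSTM -> attention pooling -> class scores."""
    def __init__(self, num_classes=3, feat_dim=128, hidden_dim=64):
        super().__init__()
        # Small frame-level CNN (placeholder for whatever backbone is actually used).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.bilstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Additive attention over time steps.
        self.attn = nn.Sequential(nn.Linear(2 * hidden_dim, 64), nn.Tanh(), nn.Linear(64, 1))
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                                   # x: (batch, time, channels, height, width)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).reshape(b, t, -1) # per-frame features
        h, _ = self.bilstm(feats)                           # (b, t, 2 * hidden_dim)
        alpha = F.softmax(self.attn(h), dim=1)              # attention weights over time
        rep = (alpha * h).sum(dim=1)                        # single weighted representation of the stream
        return self.classifier(rep)                         # stream-specific class scores

class TwoStreamPainNet(nn.Module):
    """Weighted aggregation of the MHI-stream and OFI-stream classification scores."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.mhi_stream = StreamAttentionNet(num_classes)
        self.ofi_stream = StreamAttentionNet(num_classes)
        self.fusion_logits = nn.Parameter(torch.zeros(2))   # learnable stream weights (assumed)

    def forward(self, mhi_seq, ofi_seq):
        w = F.softmax(self.fusion_logits, dim=0)
        return w[0] * self.mhi_stream(mhi_seq) + w[1] * self.ofi_stream(ofi_seq)

# Example: a batch of 2 clips, 16 frames each, 3-channel 112x112 inputs.
model = TwoStreamPainNet()
scores = model(torch.randn(2, 16, 3, 112, 112), torch.randn(2, 16, 3, 112, 112))
print(scores.shape)  # torch.Size([2, 3])
```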

Highlights

  • An individual’s affective disposition is often expressed through facial expressions. Human beings are able to assess someone’s current mood or state of mind by observing his or her facial demeanour

  • Accuracy = (tp + tn) / (tp + tn + fp + fn), where tp refers to true positives, tn to true negatives, fp to false positives and fn to false negatives (a short illustration follows these highlights)

  • An approach based on a weighted aggregation of the scores of two deep attention networks, operating respectively on Motion History Images (MHIs) and Optical Flow Images (OFIs), has been proposed and evaluated for the analysis of pain-related facial expressions
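
A minimal illustration of the accuracy metric quoted in the highlights, using hypothetical confusion-matrix counts:

```python
def accuracy(tp, tn, fp, fn):
    """Fraction of correctly classified samples among all samples."""
    return (tp + tn) / (tp + tn + fp + fn)

print(accuracy(tp=40, tn=35, fp=10, fn=15))  # 0.75
```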



Introduction

Human beings are able to assess someone’s current mood or state of mind by observing his or her facial demeanour. An analysis of facial expressions can provide valuable insight into one’s emotional and psychological state. The current work focuses on the analysis of facial expressions for the assessment and recognition of pain in video sequences. A two-stream attention network is designed, with the objective of combining both temporal and spatial aspects of facial expressions, based on sequences of motion history images [1] and optical flow images [2], to accurately discriminate between neutral, low, and high levels of nociceptive pain. An overview of pain recognition approaches based on facial expressions is provided in Section 2, followed by a thorough …
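
To make the two input representations concrete, the sketch below derives a motion history image and an HSV-encoded optical-flow image from consecutive grayscale frames with OpenCV and NumPy. The difference threshold, decay duration, Farneback parameters and the input file name are illustrative assumptions, not the paper's settings.

```python
import cv2
import numpy as np

MHI_DURATION = 15    # number of frames a motion pixel stays visible (assumed value)
DIFF_THRESHOLD = 30  # frame-difference threshold (assumed value)

def update_mhi(mhi, prev_gray, gray):
    """Motion History Image: recently moving pixels are bright, older motion fades away."""
    motion = cv2.absdiff(gray, prev_gray) > DIFF_THRESHOLD
    return np.where(motion, MHI_DURATION, np.maximum(mhi - 1, 0))

def flow_image(prev_gray, gray):
    """Optical Flow Image: Farneback flow encoded as HSV (hue = direction, value = magnitude)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hsv = np.zeros((*gray.shape, 3), dtype=np.uint8)
    hsv[..., 0] = ang * 180 / np.pi / 2                              # hue: flow direction
    hsv[..., 1] = 255                                                # full saturation
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)  # value: flow magnitude
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

cap = cv2.VideoCapture("face_clip.mp4")  # hypothetical input video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
mhi = np.zeros(prev_gray.shape, dtype=np.float32)
mhi_seq, ofi_seq = [], []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mhi = update_mhi(mhi, prev_gray, gray)
    mhi_seq.append((mhi / MHI_DURATION * 255).astype(np.uint8))
    ofi_seq.append(flow_image(prev_gray, gray))
    prev_gray = gray
cap.release()
```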
