Abstract

Imagery of facial expressions in Autism Spectrum Disorder (ASD) is likely impaired but has been very difficult to capture at a neurophysiological level. We developed an approach that allowed us to directly link observation of emotional expressions with imagery in ASD, and to derive biomarkers capable of classifying abnormal imagery in ASD. To bridge perception and action imagery cycles, it is important to use visual stimuli that capture the dynamic nature of emotion representation. We conducted a case-control study linking both visualization and mental imagery of dynamic facial expressions, and investigated source responses to pure face-expression contrasts. We replicated the same highly group-discriminative neural signatures in the precuneus during both action observation (dynamic facial expressions) and imagery. Larger activation in imagery-related regions in the ASD group suggests that this effect is compensatory. We then applied a machine learning procedure to automatically identify these group differences from the EEG activity recorded during mental imagery of facial expressions. Comparing two classifiers, we achieved an accuracy of 81% using 15 features (both linear and non-linear) of the signal from the theta, high-beta, and gamma bands extracted from right-parietal locations (matching the precuneus region), further corroborating the standard statistical analyses. This robust classification of signals arising from imagery of dynamic expressions in ASD is notable because it significantly exceeds the already good classification achieved with observation of neutral facial expressions (74%). This novel neural correlate of emotional imagery in autism could potentially serve as a target for clinical interventions designed to improve facial expression recognition, or at least as an intervention biomarker.
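The classification pipeline is only summarized above. As a rough illustration, the minimal sketch below shows one way such a pipeline could look: per-channel spectral power in the theta, high-beta, and gamma bands (linear features) plus spectral entropy (a simple non-linear feature) computed over a right-parietal channel subset, selection of the 15 most discriminative features, and a comparison of two standard classifiers. All names, band edges, parameter values, and the choice of SVM and LDA as the two classifiers are assumptions for illustration, not the paper's actual pipeline.

```python
# Hypothetical sketch of band-power + entropy features and a two-classifier
# comparison; placeholder random data stands in for imagery-condition EEG.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "high_beta": (20, 30), "gamma": (30, 45)}  # assumed edges

def features(epoch, fs=FS):
    """Per-channel band powers (linear) + spectral entropy (non-linear)
    for one epoch of shape (n_channels, n_samples) -> flat feature vector."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)      # psd: (n_channels, n_freqs)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs <= hi)
        feats.append(psd[:, mask].mean(axis=1))        # mean power in band
    p = psd / psd.sum(axis=1, keepdims=True)           # normalized spectrum
    feats.append(-(p * np.log2(p + 1e-12)).sum(axis=1))  # spectral entropy
    return np.concatenate(feats)

# Placeholder data: 80 trials, 4 right-parietal channels, 2 s epochs.
rng = np.random.default_rng(0)
X_epochs = rng.standard_normal((80, 4, 2 * FS))
y = rng.integers(0, 2, 80)  # 0 = control, 1 = ASD (labels are illustrative)

X = np.stack([features(ep) for ep in X_epochs])  # (80, 16) feature matrix

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("LDA", LinearDiscriminantAnalysis())]:
    pipe = make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=15),  # keep 15 best features
                         clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```

With real recordings, the epochs would come from the imagery condition restricted to right-parietal electrodes, and accuracy should be estimated with a subject-aware cross-validation scheme to avoid leakage across trials from the same participant.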

Highlights

  • Faces represent a critical source of visual information for social perception, conveying relevant information about identity and emotional states of others (Kanwisher and Yovel, 2006)

  • The event-related potentials (ERPs) obtained from the visual stimulation task show two clear independent components, the first peaking around 300 ms and the second around 600 ms (Figure 2); see the peak-latency sketch after this list

  • We addressed for the first time facial expression (FE) imagery in Autism Spectrum Disorder (ASD) and identified a common neural correlate of observation and mental imagery (MI) of dynamic FEs in this condition, in the precuneus
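The component latencies in the ERP highlight above are typically read off the trial-averaged waveform. The following minimal sketch, on synthetic data, shows how the peak latency of each component can be located by searching for the largest deflection inside a time window around 300 ms and 600 ms; the sampling rate, window bounds, and synthetic waveform are illustrative assumptions, not the study's analysis code.

```python
# Hypothetical sketch: locating two ERP components (~300 ms and ~600 ms)
# in a trial-averaged waveform built from synthetic placeholder data.
import numpy as np

FS = 250  # assumed sampling rate (Hz)

def component_latency(evoked, fs, t_lo, t_hi):
    """Latency (s) of the largest absolute deflection inside [t_lo, t_hi]."""
    t = np.arange(evoked.shape[-1]) / fs
    win = (t >= t_lo) & (t <= t_hi)
    idx = np.argmax(np.abs(evoked[win]))
    return t[win][idx]

# Synthetic single-channel data: 60 trials of 0.8 s with two injected components.
rng = np.random.default_rng(1)
t = np.arange(0, 0.8, 1 / FS)
template = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.7 * np.exp(-((t - 0.6) / 0.08) ** 2)
epochs = template + 0.5 * rng.standard_normal((60, t.size))

evoked = epochs.mean(axis=0)  # trial average
for name, (lo, hi) in {"early component": (0.2, 0.4),
                       "late component": (0.5, 0.7)}.items():
    lat = component_latency(evoked, FS, lo, hi)
    print(f"{name}: peak at {lat * 1000:.0f} ms")
```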


Introduction

Faces represent a critical source of visual information for social perception, conveying relevant information about the identity and emotional states of others (Kanwisher and Yovel, 2006). The perceptual strength and spatial frequency of facial expression (FE) stimuli appear to be relevant for eliciting ASD group differences during simple visual presentation (Vlamings et al., 2010; Luckhardt et al., 2017), yet the large majority of visual perception studies use static frame stimuli, lacking the dynamic characteristics of naturalistic FEs (Monteiro et al., 2017). Those dynamics have been shown to play a crucial role in the perception of the respective FE and its emotional valence (Krumhuber et al., 2013), possibly because they allow perception and action imagery cycles to be generated.

