Context transformer with multiscale fusion for robust facial emotion recognition

Similar Papers
  • Research Article
  • Citations: 10
  • 10.2147/ndt.s106989
Investigation of facial emotion recognition, alexithymia, and levels of anxiety and depression in patients with somatic symptoms and related disorders
  • Apr 29, 2016
  • Neuropsychiatric Disease and Treatment
  • Ahmet Ozturk + 3 more

Background: The concept of facial emotion recognition is well established in various neuropsychiatric disorders. Although emotional disturbances are strongly associated with somatoform disorders, only a limited number of studies have investigated facial emotion recognition in somatoform disorders, and none have examined this issue under the new diagnostic criteria, which reclassify somatoform disorders as somatic symptom and related disorders (SSD). In this study, we aimed to compare facial emotion recognition between patients with SSD and age- and sex-matched healthy controls (HC), using the new criteria for SSD.
Patients and methods: After applying the inclusion and exclusion criteria, 54 patients diagnosed with SSD according to the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria and 46 age- and sex-matched HC were selected to participate in the present study. Facial emotion recognition, alexithymia, and anxiety and depression status were compared between the groups.
Results: Patients with SSD had significantly lower facial emotion scores for fear, disgust, and neutral faces compared with age- and sex-matched HC (t=−2.88, P=0.005; t=−2.86, P=0.005; and t=−2.56, P=0.009, respectively). After controlling for the effects of alexithymia and depressive and anxious states, the groups were similar in their responses to facial emotions and in mean reaction times.
Discussion: Although only a limited number of studies have examined facial emotion recognition in patients with somatoform disorders, ours is the first to investigate it in patients with SSD diagnosed according to the DSM-5 criteria. Recognition of facial emotion was found to be disturbed in patients with SSD. However, our findings suggest that these disturbances were significantly associated with alexithymia and with depression and anxiety status, consistent with previous studies. Further studies are needed to clarify the associations between facial emotion recognition and SSD.

  • Research Article
  • Citations: 6
  • 10.3390/diagnostics12071721
Is There a Difference in Facial Emotion Recognition after Stroke with vs. without Central Facial Paresis?
  • Jul 15, 2022
  • Diagnostics
  • Anna-Maria Kuttenreich + 2 more

The Facial Feedback Hypothesis (FFH) states that facial emotion recognition is based on the imitation of facial emotional expressions and the processing of physiological feedback. In the light of limited and contradictory evidence, this hypothesis is still being debated. Therefore, in the present study, emotion recognition was tested in patients with central facial paresis after stroke. Performance in facial vs. auditory emotion recognition was assessed in patients with vs. without facial paresis. The accuracy of objective facial emotion recognition was significantly lower in patients with vs. without facial paresis and also in comparison to healthy controls. Moreover, for patients with facial paresis, the accuracy measure for facial emotion recognition was significantly worse than that for auditory emotion recognition. Finally, in patients with facial paresis, the subjective judgements of their own facial emotion recognition abilities differed strongly from their objective performances. This pattern of results demonstrates a specific deficit in facial emotion recognition in central facial paresis and thus provides support for the FFH and points out certain effects of stroke.

  • Research Article
  • Citations: 103
  • 10.1016/j.neubiorev.2021.104518
A systematic review and meta-analysis of facial emotion recognition in autism spectrum disorder: The specificity of deficits and the role of task characteristics
  • Dec 31, 2021
  • Neuroscience and biobehavioral reviews
  • Michael K Yeung

  • Research Article
  • 10.2478/rjap-2024-0001
The Emotional Toll of HIV: Exploring Facial and Auditory Emotion Recognition and Emotion Regulation in People Living With HIV
  • Jan 1, 2024
  • Romanian Journal of Applied Psychology
  • Rakesh Kumar Singh + 2 more

People living with Human Immunodeficiency Virus (PLHIV) have been reported to show poor facial emotion recognition. However, these studies presented participants with photographs of facial emotions, whereas real-life facial emotion recognition rarely involves inferring emotions from static faces. Moreover, emotion recognition in other sensory modalities, such as audition, has hardly been explored, and there is also a dearth of studies examining emotion regulation difficulties in this group. The present study therefore explored facial emotion recognition (using facial emotion videos) and auditory emotion recognition, as well as difficulties in emotion regulation (using the Hindi version of the Difficulties in Emotion Regulation Scale), in 60 PLHIV and 60 people without HIV (PWoHIV). Additionally, the associations of HIV duration (time since diagnosis), viral load, and cluster of differentiation 4 (CD4) count with emotion recognition and regulation difficulties in PLHIV were explored. Findings from a one-way ANCOVA (with education and socioeconomic status as covariates) revealed significantly poorer auditory emotion recognition (particularly for fear) in PLHIV than in PWoHIV. The former also showed significantly poorer facial emotion recognition for surprise. PLHIV also self-reported significantly more emotion regulation difficulties than PWoHIV, specifically nonacceptance of their responses to negative emotions (Nonacceptance) and limited access to emotion regulation strategies (Strategies). CD4 count was negatively correlated with emotion regulation difficulties, particularly difficulty accomplishing goal-directed behaviour when experiencing negative emotions (Goals) and Strategies. Besides the novel finding of impaired auditory emotion recognition in PLHIV, these results can help develop targeted interventions to improve emotion recognition and emotion regulation for PLHIV.

  • Research Article
  • Citations: 51
  • 10.1017/s1355617714000939
Facial and bodily emotion recognition in multiple sclerosis: the role of alexithymia and other characteristics of the disease.
  • Nov 1, 2014
  • Journal of the International Neuropsychological Society
  • Cinzia Cecchetto + 6 more

Multiple sclerosis (MS) may be associated with impaired perception of facial emotions. However, emotion recognition mediated by bodily postures has never been examined in these patients. Moreover, several studies have suggested a relation between emotion recognition impairments and alexithymia, in line with the idea that recognizing emotions requires individuals to understand their own emotions. Although a deficit in emotion recognition has been observed in MS patients, the association between impaired emotion recognition and alexithymia has received little attention. The aim of this study was, first, to investigate MS patients' ability to recognize emotions mediated by both facial and bodily expressions and, second, to examine whether any observed deficits in emotion recognition could be explained by the presence of alexithymia. Thirty patients with MS and 30 healthy matched controls performed experimental tasks assessing emotion discrimination and recognition of facial expressions and bodily postures, and completed questionnaires evaluating alexithymia, depression, and fatigue. First, facial emotion recognition and, to a lesser extent, bodily emotion recognition can be impaired in MS patients; in particular, patients with higher disability showed impaired emotion recognition compared with patients with lower disability and with controls. Second, their deficit in emotion recognition was not predicted by alexithymia. Instead, the disease's characteristics and performance on some cognitive tasks correlated significantly with emotion recognition. Impaired facial emotion recognition is a cognitive signature of MS that does not depend on alexithymia.

  • Research Article
  • Citations: 9
  • 10.3390/diagnostics12051138
Facial Emotion Recognition in Patients with Post-Paralytic Facial Synkinesis-A Present Competence.
  • May 4, 2022
  • Diagnostics (Basel, Switzerland)
  • Anna-Maria Kuttenreich + 4 more

Facial palsy is a movement disorder that affects verbal and nonverbal communication. The aim of this study is to investigate the effects of post-paralytic facial synkinesis on facial emotion recognition. In a prospective cross-sectional study, we compared facial emotion recognition between n = 30 patients with post-paralytic facial synkinesis (mean disease duration: 1581 ± 1237 days) and n = 30 healthy controls matched for sex, age, and education level. Facial emotion recognition was measured with the Myfacetraining Program. As an intra-individual control condition, auditory emotion recognition was assessed via the Montreal Affective Voices. Moreover, self-assessed emotion recognition was studied with questionnaires. On average, there was no significant difference between patients and healthy controls in either facial or auditory emotion recognition; the measured outcomes as well as the self-reports were comparable between groups. In contrast to previous studies in patients with peripheral and central facial palsy, these results indicate an unimpaired ability for facial emotion recognition. Impaired facial and auditory emotion recognition was detected only in individual patients with pronounced facial asymmetry and severe facial synkinesis. Further studies should compare emotion recognition in patients with pronounced facial asymmetry in acute and chronic peripheral paralysis and in central and peripheral facial palsy.

  • Research Article
  • Citations: 1
  • 10.1016/j.rasd.2024.102400
The role of emotional factors in face processing abilities in autism spectrum conditions
  • May 3, 2024
  • Research in Autism Spectrum Disorders
  • Natasha Baxter + 1 more

Facial emotion recognition is considered atypical in individuals with autism spectrum conditions (ASC), but emotion recognition abilities vary widely among autistic people, and findings on the causes of these differences are inconsistent. Research indicates that alexithymia may account for facial emotion recognition differences in ASC. Alternatively, mood disorders have been linked to atypical facial emotional expression recognition abilities in neurotypical adults. Investigating the effects of both alexithymia and mood disorders (depression and anxiety) is necessary to establish which of these factors may cause atypical facial emotion recognition in ASC. This study aimed to examine whether alexithymia or mood disorder symptomology predicts atypical facial emotion recognition in individuals with ASC. Ninety-eight non-autistic adults and 80 autistic adults were recruited. Participants completed an online face processing task examining emotion and identity recognition abilities, along with the AQ-28, the TAS-20, and the HADS to measure autism severity, alexithymia symptoms, and depression and anxiety symptoms. Regression-based analyses found that autistic traits and autistic group membership did not predict facial emotion processing abilities after accounting for demographic variables, alexithymia, and mood disorders; however, neither alexithymia nor mood disorder symptoms predicted variance in face processing abilities either. Our results concur with previous meta-analyses of facial emotion processing in autism spectrum disorder, which report that studies do not always find deficits in face processing in autism; our findings also do not support the model that alexithymia explains facial emotion processing difficulties in autism.

  • Research Article
  • Citations: 17
  • 10.1016/j.jad.2016.08.068
The effect of comorbid depression on facial and prosody emotion recognition in first-episode schizophrenia spectrum
  • Oct 15, 2016
  • Journal of Affective Disorders
  • Sarah E Herniman + 4 more

  • Research Article
  • Citations: 136
  • 10.1097/jgp.0b013e318165dbce
Facial Emotion Recognition Deficit in Amnestic Mild Cognitive Impairment and Alzheimer Disease
  • May 1, 2008
  • The American Journal of Geriatric Psychiatry
  • Ilaria Spoletini + 9 more

  • Research Article
  • Citations: 1
  • 10.31661/gmj.v4i3.358
A Comparative Study of the Ability of Facial Emotional Expression Recognition and its Relationship with Communication Skills in Iranian Patients with Mood Disorders
  • Jul 25, 2015
  • Galen Medical Journal
  • Seyed Hamid Seyednezhad Golkhatmi + 4 more

Background: Impaired facial emotion recognition and impaired communication skills in psychiatric patients, such as those with mood disorders, are among the most important clinical issues. The present study aims to evaluate and compare facial emotion recognition among patients with depression, patients with bipolar disorder in the manic phase, and a normal group with no diagnosed disorder, and to evaluate the relationship between facial emotion recognition ability and communication skills in these patients. Materials and Methods: Participants comprised 30 patients with depression, 30 patients with bipolar disorder, and 30 subjects from a normal group, for a total of 90 subjects selected using a convenience sampling method. The PC version of Ekman's facial emotion test (1976) and Queendom's interpersonal communication skills test (2004) were used to collect data. Data were analyzed using correlation tests, one-way analysis of variance, and Tukey's post hoc test. Results: There was a significant difference in facial emotion recognition between patients with mood disorders and the normal group. Moreover, facial emotion recognition correlated with communication skills in these patients. Conclusions: Given the facial emotion recognition impairment found in patients with mood disorders and its significant relationship with communication skills, attending to these factors is very important in treating these disorders and reducing relapse. [GMJ. 2015;4(3):90-99]

  • Research Article
  • Citations: 65
  • 10.1016/j.eswa.2014.08.042
Adaptive 3D facial action intensity estimation and emotion recognition
  • Sep 16, 2014
  • Expert Systems with Applications
  • Yang Zhang + 2 more

  • Research Article
  • Citations: 69
  • 10.1080/02699931.2020.1815655
Facial mimicry, empathy, and emotion recognition: a meta-analysis of correlations
  • Sep 13, 2020
  • Cognition and Emotion
  • Alison C Holland + 2 more

A number of prominent theories have linked the tendency to mimic others' facial movements to empathy and facial emotion recognition, but the evidence for such links is uneven. We conducted a meta-analysis of correlations of facial mimicry with empathy and with facial emotion recognition skills. Other factors were also examined for moderating influence, e.g. the facets of empathy measured, the facial muscles recorded, and the facial emotions being mimicked. Summary effects were estimated with a random-effects model, and a meta-regression analysis was used to identify moderating factors. In total, 162 effects from 28 studies were included. The summary effect size indicated a significant but weak positive relationship between facial mimicry and empathy, but not facial emotion recognition. The moderator analysis revealed stronger correlations between facial mimicry and empathy for static vs. dynamic facial stimuli, and for implicit vs. explicit instances of facial emotion processing. No differences were seen between facial emotions, facial muscles, emotional and cognitive facets of empathy, or state and trait measures of empathy. The results support the claim that stronger facial mimicry responses are positively related to higher dispositions for empathy, but the weakness and variability of this effect suggest that the relationship is conditional on factors that are not fully understood.

  • Research Article
  • Citations: 4
  • 10.25139/inform.v7i1.4282
A Survey on Deep Learning Algorithms in Facial Emotion Detection and Recognition
  • Jan 20, 2022
  • Inform : Jurnal Ilmiah Bidang Teknologi Informasi dan Komunikasi
  • Prince Awuah Baffour + 3 more

Facial emotion recognition (FER) forms part of affective computing, in which computers are trained to recognize human emotion from human expressions. FER is essential for bridging the communication gap between humans and computers, because facial expressions are a form of communication estimated to transmit 55% of a person's emotional and mental state in face-to-face communication. Breakthroughs in this field also enable computer systems (e.g. robotic systems) to better serve and interact with humans. Research in this area has advanced considerably, with deep learning at its heart. This paper systematically discusses state-of-the-art deep learning architectures and algorithms for facial emotion detection and recognition. It also reveals the dominance of CNN architectures over other known approaches such as RNNs and SVMs, highlighting the contributions, model performance, and limitations of the reviewed state of the art. It further identifies available opportunities and open issues worth considering in future FER research, and discusses how limits on computational power and on the availability of large facial emotion datasets have constrained the pace of progress.
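As a rough illustration of the convolution-and-pooling operations at the core of the CNN architectures the survey reviews, the sketch below runs one conv + ReLU + max-pool stage over a single-channel image in plain NumPy. The 48×48 grayscale input size is an assumption (loosely modeled on common FER benchmarks such as FER2013), and the kernel here is random rather than learned; this is a minimal sketch of the operations, not any specific model from the paper.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(feature_map, size=2):
    """Non-overlapping max pooling; trims edges that do not fit a full window."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    return (feature_map[:h, :w]
            .reshape(h // size, size, w // size, size)
            .max(axis=(1, 3)))

# A hypothetical 48x48 grayscale face crop through one conv + ReLU + pool stage.
rng = np.random.default_rng(0)
face = rng.random((48, 48))
kernel = rng.standard_normal((3, 3))        # random, stands in for a learned filter
features = np.maximum(conv2d(face, kernel), 0.0)  # ReLU nonlinearity
pooled = max_pool2d(features)               # 46x46 feature map -> 23x23
print(pooled.shape)  # (23, 23)
```

A real FER CNN stacks several such stages (with many learned filters per stage) and ends in a fully connected softmax layer over the emotion classes.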

  • Research Article
  • 10.6084/m9.figshare.11363765
Facial Recognition and Its Use-Cases in Online Education
  • Jan 1, 2019
  • Fitboy Awesome

Biometrics is no longer limited to attendance tracking and smartphone screen locks; numerous sectors are implementing biometrics as a means of authentication. As Isabelle Moeller, chief executive of the Biometric Institute, explains: "Sectors including mobile, banking, and education are showing a real appetite for the use of biometric authentication."

Face as a biometric identity: Like any other method, facial authentication serves as a biometric authentication mechanism, but it is more reliable and easier to use. Digital cameras are available in almost every smartphone, making face identification as easy as the blink of an eye, and unlike a fingerprint, a face serves more purposes than mere identification. Paid e-libraries can use it to secure their content and avoid copyright infringement: usually protected only by simple password authentication, e-libraries carry a high risk of identity fraud, since a user's credentials can be used by someone else to access content for malicious purposes.

Biometric ID for assessment and personalisation: Facial recognition can help identify learners in many scenarios. One such scenario is taking online assessments. Online education faces a lot of criticism due to the high occurrence of cheating; a user could ask anyone else to take a test on their behalf and get a good score, which would be unjust to the students who work hard for their marks. Facial recognition has the potential to make online assessments viable. Another use case is serving personalised content to learners: with online facial recognition technology, capturing usage data for analytics and using it to personalise content becomes more feasible.

Learner engagement using facial emotions: Although not currently in use, emotion recognition is a possible future application of online facial recognition. Amazon has launched Rekognition, an AI service that detects the emotions of customers and analyses what should be done to make them more satisfied. In classroom-based scenarios, teachers make sure that students are showing interest in learning, judging this from students' emotional expressions. Facial emotion recognition could bring this to online learning: using AI, a learner's facial emotions could be analysed to gauge engagement during learning. The integrity of online education is often questioned, especially given increasing cheating and plagiarism, because there are no suitable checks to detect whether the person enrolled in the course takes the exams or someone else sits them on their behalf, and this maligns the integrity of the education sector. Read more: https://shuftipro.com/biometric-consent-verification/
