Influence of anger on the evaluation of emotional congruence between scenes and facial expressions

Abstract

Individuals with high levels of anger exhibit difficulties in executive attention and in inhibiting dominant responses and/or activating subdominant ones, which may influence their appraisal of emotional congruence. This study aimed to analyze the relationship between anger levels associated with aggression and the evaluation of emotional congruence between scenes and facial expressions. The International Affective Picture System (IAPS) was used to select emotional scenes (Positive, Violent, Repulsive, Neutral), and the NimStim Face Stimulus Set was used to select emotional facial expressions (happiness, anger, fear, disgust, neutral). Additionally, the Buss and Perry (1992) Aggression Questionnaire was used to assess anger levels (lower, middle, upper). Participants with higher anger levels (compared to those with lower levels) showed longer response times when rating emotional congruence between scenes and facial expressions. The effect size was large, with stronger effects for negative congruence (r > .50) than for positive congruence (r = .33). These findings may be explained by difficulties in attentional control among participants with higher anger levels, particularly when processing negative information.
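The abstract reports rank-style effect sizes (r > .50, r = .33). As a minimal, purely illustrative sketch (the paper's actual analysis pipeline is not described here, and the response times below are invented placeholders), the common convention r = Z/√N can be applied to a Mann-Whitney comparison of two anger groups:

```python
import math

# Hypothetical response times (ms) for congruence ratings in two anger groups.
# These numbers are illustrative placeholders, not the study's data.
lower = [812, 790, 845, 801, 778, 860, 795, 830]     # lower-anger group
upper = [955, 990, 1010, 940, 1005, 970, 985, 1020]  # upper-anger group

def rank_sum_effect_r(a, b):
    """Mann-Whitney U (normal approximation); returns (U, Z, r = |Z|/sqrt(N))."""
    combined = sorted((v, i) for i, v in enumerate(a + b))
    ranks = {idx: rank for rank, (v, idx) in enumerate(combined, start=1)}
    # Note: no tie correction -- the illustrative data contain no tied values.
    n1, n2 = len(a), len(b)
    r1 = sum(ranks[i] for i in range(n1))  # rank sum of the first group
    u1 = r1 - n1 * (n1 + 1) / 2            # U statistic for group a
    mu = n1 * n2 / 2                       # mean of U under the null hypothesis
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma                  # normal approximation of U
    return u1, z, abs(z) / math.sqrt(n1 + n2)

u, z, r = rank_sum_effect_r(lower, upper)
print(f"U = {u:.0f}, Z = {z:.2f}, r = {r:.2f}")
```

With fully separated placeholder groups like these, r lands well above .50, the threshold the abstract treats as a large effect for negative congruence.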

Similar Papers
  • Research Article
  • Cited by 44
  • 10.1162/jocn_a_00734
Selective attention modulates early human evoked potentials during emotional face-voice processing.
  • Apr 1, 2015
  • Journal of Cognitive Neuroscience
  • Hao Tam Ho + 2 more

Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  • Book Chapter
  • 10.1007/978-3-319-96074-6_57
Seat Comfort Evaluation Using Face Recognition Technology
  • Aug 5, 2018
  • Flavia Renata Dantas Alves Silva Ciaccia + 5 more

One of the difficulties inherent to comfort assessment is translating comfort perception into quantifiable variables in order to measure it and use the result to improve seat comfort. This study describes the opportunities of using facial expression recognition technology to compare the perceived comfort of two aircraft seats installed in a representative environment. Facial expressions are one of the most apparent ways to capture emotions, and it is known that there are six basic emotions which are universal throughout human cultures: fear, disgust, anger, surprise, happiness and sadness. Twenty-one subjects (18 males and 3 females) participated in this experiment and had their faces recorded while they used the seats and answered questions. The recordings were subsequently analyzed with Emotion Research Lab facial recognition technology to obtain an emotional analysis of the facial expressions displayed by the participants during the experiment. The facial expression recognition software of Emotion Research Lab captures facial micro-expressions and uses them to predict the behavior of the participants through the calculation of different metrics such as activation, engagement, satisfaction, valence, relevance and enjoyment. The results showed that seat 1 was better rated by participants and had emotional congruence with their answers. The most important finding was that even subtle differences in seats could be perceived in participants’ emotions, suggesting that the use of facial expression recognition technology to compare the perceived comfort of aircraft seats is viable and should be explored further during the seat development process.

  • Research Article
  • 10.1016/j.biopsycho.2025.109072
The influence of facial expression absence on the recognition of different emotions: Evidence from behavioral and event-related potentials studies.
  • Jul 1, 2025
  • Biological psychology
  • Juan Song + 3 more

  • Research Article
  • Cited by 1
  • 10.1027/0269-8803/a000290
In Identifying the Source of the Incongruent Effect
  • Oct 14, 2021
  • Journal of Psychophysiology
  • Tingji Chen + 3 more

Emotional signals from the face and body are normally perceived as an integrated whole in everyday life. Previous studies have revealed an incongruent effect, which refers to distinctive behavioral and neural responses to emotionally congruent versus incongruent face-body compounds. However, it remained unknown which kind of face-body compound caused the incongruent effect. In the present study, we added neutral face and neutral body stimuli to form new face-body compounds. Forty subjects with normal or corrected-to-normal vision participated in this experiment. By comparing the face-body compounds with emotional conflict to face-body compounds with neutral stimuli, we could investigate the source of the incongruent effect. For both behavioral and event-related potential (ERP) data, a 2 (bodily expression: happiness, fear) × 2 (congruence: congruent, incongruent) repeated-measures analysis of variance (ANOVA) was performed to re-investigate the incongruent effect, and a 3 (facial expression: fearful, happy, neutral) × 3 (bodily expression: fearful, happy, neutral) repeated-measures ANOVA was performed to clarify its source. As expected, both behavioral and ERP results successfully replicated the incongruent effect. Specifically, the behavioral data showed that emotionally congruent face-body compounds were recognized more accurately than incongruent ones (p < .05). The N2 component was modulated by the emotional congruency between the facial and bodily expressions: emotionally incongruent compounds elicited greater N2 amplitudes than emotionally congruent compounds (p < .05). No incongruent effect was found for the P1 or P3 components (p = .079 and p = .99, respectively). Furthermore, by comparing the emotionally incongruent pairs with the neutral baseline, the present study suggests that the source of the incongruent effect might be the happy face-fearful body compounds. We speculate that the emotion expressed by the fearful body was much more intense than the emotion expressed by the happy body and thus caused stronger interference in judging the facial expressions.

  • Research Article
  • Cited by 2
  • 10.1016/j.biopsycho.2023.108611
Control over emotional facial expressions: Evidence from facial EMG and ERPs in a Stroop-like task
  • Jun 9, 2023
  • Biological Psychology
  • Qiang Xu + 2 more

  • Research Article
  • Cited by 39
  • 10.3389/fpsyg.2017.02175
Facial Expressions in Context: Electrophysiological Correlates of the Emotional Congruency of Facial Expressions and Background Scenes.
  • Dec 12, 2017
  • Frontiers in Psychology
  • Qiang Xu + 3 more

Facial expressions can display personal emotions and indicate an individual’s intentions within a social situation. They are extremely important to the social interaction of individuals. Background scenes in which faces are perceived provide important contextual information for facial expression processing. The purpose of this study was to explore the time course of emotional congruency effects in processing faces and scenes simultaneously by recording event-related potentials (ERPs). The behavioral results showed that categorization of facial expressions was faster and more accurate when the face was emotionally congruent rather than incongruent with the emotion displayed by the scene. In the ERPs, late positive potential (LPP) amplitudes were modulated by the emotional congruency between faces and scenes. Specifically, happy faces elicited larger LPP amplitudes within positive than within negative scenes, and fearful faces elicited larger LPP amplitudes within negative than within positive scenes. No scene effects were found on the P1 or N170 components. These findings indicate that emotional congruency effects can occur in late stages of facial expression processing, reflecting motivated attention allocation.

  • Research Article
  • 10.1111/jcal.70155
When Do Teachers' Pleasant Expressions in Video Lectures Facilitate Learning? The Role of Emotional Learning Materials and Auditory Emotions
  • Nov 27, 2025
  • Journal of Computer Assisted Learning
  • Fangfang Zhu + 5 more

Background Emotional cues in video lectures have demonstrated complex effects on learning, particularly regarding teachers' facial expressions. However, these effects remain inconclusive, necessitating further exploration of potential factors to enhance learning. Objectives This study examined how three forms of emotional design (learning materials, teachers' facial expressions, and teachers' auditory emotions) individually and jointly influence learners' emotional responses, cognitive processing and learning outcomes in video‐based instruction. Methods Across two experiments, we investigated the independent and interactive effects of teachers' facial expressions, the emotional design of learning materials and teachers' auditory emotion on students' emotions, motivation, attention, cognitive load and learning outcomes. Experiment 1 examined the interaction between teachers' facial expressions and emotionally designed learning materials, while Experiment 2 built on these findings to test whether congruent positive facial and auditory cues further enhance students' emotional, motivational and cognitive engagement. Results In Experiment 1, when learning materials were neutrally designed, teachers' pleasant facial expressions reduced extraneous cognitive load and improved learning outcomes. Experiment 2 showed that pairing pleasant facial expressions with pleasant auditory emotion elicited more positive emotions, higher motivation, increased germane load and better learning outcomes. Eye‐tracking analyses indicated that this emotional congruence decreased attentional distraction, highlighting the synergistic benefits of combining visual and auditory emotional cues. Conclusions The study identifies the synergistic effects of various emotional design elements in video lectures on students' learning and contributes to theories of emotional design and cognitive processing in multimedia learning contexts. 
It also offers practical insights for educators on optimising emotional cues in video‐based learning environments.

  • Research Article
  • Cited by 17
  • 10.1016/j.biopsycho.2022.108405
Emotional violation of faces, emojis, and words: Evidence from N400
  • Aug 4, 2022
  • Biological Psychology
  • Linwei Yu + 6 more

  • Research Article
  • 10.1080/13803395.2024.2391362
Happy facial emotional congruence in patients with relapsing-remitting multiple sclerosis
  • Aug 8, 2024
  • Journal of Clinical and Experimental Neuropsychology
  • Pauline Gury + 6 more

Background Emotion categorization has often been studied in the relapsing-remitting form of multiple sclerosis (RR-MS), suggesting an impairment in the recognition of emotions. The production of facial emotional expressions in RR-MS has not been considered, despite their importance in non-verbal communication. Method Twenty-five RR-MS patients and twenty-five matched controls completed a task of emotional categorization during which their faces were filmed. The stimuli were dynamic (sound or visual), expressed by adults (women or men), and expressing happy (laughing or smiling) or negative emotion. Two independent blinded raters quantified the happy facial expressions produced. The categorization task was used as a proxy for emotional categorization, while the happy facial expressions produced assessed the production of emotions. Results The main analysis indicated selectively impaired categorization of happy stimuli in RR-MS patients, whereas their happy facial expressions were not statistically different from those of the control group. More specifically, this group effect was found for smiles (and not laughter) and for happy stimuli produced by men. Analysis of individual patient profiles suggested that 77% of patients with impaired judgments produced normal facial expressions, suggesting a high prevalence of this dissociation. Only 8% of our sample showed the reverse dissociation, with happy facial expressions significantly different from those of the control group and normal emotional judgments. Conclusion These results corroborated the high prevalence of emotional categorization impairment in RR-MS, though not for negative stimuli, which can probably be explained by the methodological specificities of the present work. The unusual impairment found for happy stimuli (for both emotional categorization and facial congruence) may be linked to the intensity of the perceived happy expressions but not to the emotional valence. 
Our results also indicated a mainly preserved production of facial emotions, which may be used in the future sociocognitive care of RR-MS patients with impaired emotional judgments.

  • Dissertation
  • 10.4226/66/5a8e4b0e4b7a1
The emotional congruence of experience and bodily change
  • May 26, 2016
  • Matthew C Reeder

This study examined the association of the experience of emotion and somatic changes. The study compared reported somatic changes generally experienced when anxious with the actual association of the experience of emotion and somatic changes as measured during a specific event. Emotions were measured as both general negative emotion as well as specific emotions: anger, disgust, fear, sadness and shame. Participants were volunteers from a Victorian university who agreed to watch a video depicting the dramatisation of child abuse. Throughout the video, participants indicated their experience of emotion. Measures were also taken throughout the procedure of facial expression and Galvanic Skin Response (GSR). In order to examine emotional congruence, subjects were divided into three groups. These groups were divided according to the congruence of subjects' experienced emotion with autonomic changes and facial expressivity. Groups were divided separately for each of the emotion types. Where there was little difference between the reported experience of emotion and what would have been expected from the observed somatic changes, the subject was deemed to be in the Congruent Group. Subjects whose reported experience of emotion was greater or less than would be expected from observed somatic changes were allocated to the Over-Reporter and Under-Reporter Groups respectively. These data were then compared to participants' reports of the number of somatic symptoms usually experienced when anxious. It was found that participants who under-report the experience of general negative emotion compared with their observed somatic changes (both GSR and facial expressivity) had lower trait somatic anxiety (reported fewer somatic symptoms usually experienced when anxious). 
There was no significant difference between the Congruent Group and Over-Reporter Group. The Under-Reporter Groups had significantly lower trait somatic anxiety than the Congruent Group when emotional congruence was defined by fear and GSR, anger and GSR, and sadness and facial expressivity. The actual association of shame and disgust with either somatic change, sadness with autonomic change, and anger and fear with facial expressivity was unrelated to the number of somatic symptoms reported to be usually experienced when anxious. The results supported the idea that subjective reports of the number of somatic symptoms usually experienced when anxious reflect the actual association of somatic change and experience, but with limitations. The actual association of the experience of fear with autonomic change seems to reflect the number of somatic symptoms reported to be usually experienced when anxious more than other emotions. Further, for those for whom the experience of anger and negative emotion had a greater association with somatic change, there was a greater number of somatic symptoms reported to be usually experienced when anxious. This would suggest that some people have a greater association of some experiences of emotion and somatic change. Furthermore, while there is an association between reported somatic changes generally experienced when anxious and the actual association of the experience of emotion and somatic changes as measured during a specific event, this was dependent on the association of the emotion types rather than being generalised for all emotions.

  • Research Article
  • Cited by 30
  • 10.1016/j.actpsy.2018.04.013
Effects of affective and emotional congruency on facial expression processing under different task demands
  • May 8, 2018
  • Acta Psychologica
  • Luis Aguado + 4 more

  • Research Article
  • 10.1093/jmt/thaf015
Reciprocal Communication Training Through Music (RCTM) for Autistic Children.
  • Aug 1, 2025
  • Journal of music therapy
  • Hayoung A Lim + 4 more

Challenges in social responsiveness and social communicative behaviors are often observed in autistic children. It is imperative to develop effective treatment methods to enhance social communication and reciprocity in autistic children. This study examines the efficacy of two treatment methods for improving social communication: Reciprocal Communication Training (RCT) and Reciprocal Communication Training through Music (RCTM). Ten autistic children participated in this study and engaged in musical and nonmusical interventions that addressed greeting, receptive communication, imitation, initiation, and emotional congruence with facial expression, emotion identification, and emotional attunement. To analyze the impact of these interventions, the study used dependent-samples t-tests to explore the differences in reciprocal communicative behaviors of autistic children between RCT and RCTM. A paired t-test analysis indicated that there were significant differences between RCT and RCTM on greeting, imitating behavior, initiating behavior, and emotional (happy vs. sad) congruence. The results indicated that participants who underwent RCTM demonstrated enhanced reciprocal communicative skills, particularly evident in the participants' improved greeting and imitation behaviors. This improvement was observed across both early and late intervention stages. Moreover, the study suggests that RCTM had a positive influence on various aspects of reciprocal and affective communication, optimizing the effects of music to create a sensory-rich environment for enhanced engagement. RCTM emerges as a promising method for fostering social communication skills in autistic children, offering potential benefits for their educational and therapeutic outcomes.
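The paired t-test analysis described above can be sketched in a few lines of pure Python; the per-child greeting scores below are hypothetical placeholders, not the study's data:

```python
import math

# Hypothetical greeting scores for ten autistic children under each condition
# (one pair per child; illustrative placeholders only).
rct_scores = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]    # Reciprocal Communication Training
rctm_scores = [5, 6, 4, 6, 5, 6, 4, 4, 6, 5]   # RCT through Music

def paired_t(x, y):
    """Dependent-samples t statistic: t = mean(d) / SE(d) on paired differences d."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance of d
    se = math.sqrt(var_d / n)                            # standard error of mean(d)
    return mean_d / se, n - 1                            # (t, degrees of freedom)

t_stat, df = paired_t(rct_scores, rctm_scores)
print(f"t({df}) = {t_stat:.2f}")
```

A significance decision would then compare the statistic against the t distribution with df = n - 1; with real data, `scipy.stats.ttest_rel` computes the same statistic along with a p value.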

  • Research Article
  • Cited by 133
  • 10.1073/pnas.0509179102
Unconscious fear influences emotional awareness of faces and voices.
  • Dec 13, 2005
  • Proceedings of the National Academy of Sciences
  • B De Gelder + 2 more

Nonconscious recognition of facial expressions opens an intriguing possibility that two emotions can be present together in one brain, with unconsciously and consciously perceived inputs interacting. We investigated this interaction in three experiments with a hemianopic patient who had residual nonconscious vision. During simultaneous presentation of facial expressions to the intact and the blind field, we measured interactions between consciously and nonconsciously recognized images. Fear-specific congruence effects were expressed as enhanced neuronal activity in fusiform gyrus, amygdala, and pulvinar. Nonconscious facial expressions also influenced processing of consciously recognized emotional voices. Emotional congruency between a visual and an auditory input enhanced activity in amygdala and superior colliculus for blind, relative to intact, field presentation of faces. Our findings indicate that recognition of fear is mandatory and independent of awareness. Most importantly, unconscious fear recognition remains robust even in the light of a concurrent incongruent happy facial expression or an emotional voice of which the observer is aware.

  • Research Article
  • Cited by 17
  • 10.1521/pedi_2015_29_219
Motor Empathy in Individuals With Psychopathic Traits: A Preliminary Study.
  • Jul 13, 2015
  • Journal of Personality Disorders
  • Yelena Khvatskaya + 1 more

The present laboratory study examined motor empathy in male and female individuals, who were either high or low on psychopathic traits, drawn from a nonclinical university population. Past findings suggest that psychopathic individuals are impaired in affective empathy, but findings on impairments in cognitive empathy are mixed. Research on motor empathy in psychopathy is scarce. The authors hypothesized that individuals high on psychopathic traits would have deficient motor empathy (similar to affective empathy) related to valenced emotion stimuli because of the automatic nature of motor empathy. Potential participants completed the Psychopathic Personality Inventory-Revised (PPI-R). Participants were chosen for the study on the basis of their PPI-R scores. All participants viewed photographic images drawn from a well-established set of stimuli (the International Affective Picture System) and were video recorded while doing so. Intensity for eight emotions (anger, contempt, disgust, fear, sad, joy, surprise, and neutral) in participants' facial expressions was measured objectively using an automated program, the Computer Expression Recognition Toolbox. Individuals high on psychopathic traits as compared with low PPI-R scorers displayed significantly less emotional congruence when viewing negative images. The study results suggest that deficits in motor empathy related to psychopathic trait levels are relatively restricted to negative emotions.

  • Research Article
  • Cited by 6
  • 10.3758/s13415-021-00890-0
Does gaze direction of fearful faces facilitate the processing of threat? An ERP study of spatial precuing effects.
  • Apr 12, 2021
  • Cognitive, Affective, & Behavioral Neuroscience
  • Jinbo Zhang + 3 more

Eye gaze is very important for attentional orienting in social life. By adopting the event-related potential (ERP) technique, we explored whether attentional orienting of eye gaze is modulated by emotional congruency between facial expressions and the targets in a spatial cuing task. Faces with different emotional expressions (fearful/angry/happy/neutral) directing their eye gaze to the left or right were used as cues, indicating the possible location of subsequent targets. Targets were line drawings of animals, which could be either threatening or neutral. Participants indicated by choice responses whether the animal would fit inside a shoebox in real life or not. Reaction times to targets were faster after valid compared with invalid cues, showing the typical eye gaze cuing effect. Analyses of the late positive potential (LPP) elicited by targets revealed a significant modulation of the gaze cuing effect by emotional congruency. Threatening targets elicited larger LPPs when validly cued by gaze in faces with negative (fearful and angry) expressions. Similarly, neutral targets showed larger LPPs when validly cued by faces with neutral expressions. Such effects were not present after happy face cues. Source localization in the LPP time window revealed that for threatening targets, the activity of right medial frontal gyrus could be related to a larger gaze-orienting effect for the fearful than the angry condition. Our findings provide electrophysiological evidence for the modulation of gaze cuing effects by emotional congruency.
