Right-hemisphere lateralisation evidenced from the chimeric face task predicts self-reported social competencies.
Observing and understanding faces is a critical component of social interactions. The neural correlates of face processing have been well established to be preferentially lateralised to the right hemisphere, though the functional role of this brain asymmetry has received less attention. Here we investigated the hypothesis that a left-visual-field (right-hemisphere) bias in face perception would be associated with a broader set of self-reported social competencies. Participants (n = 348) completed a chimeric face task, requiring judgements of which side of each chimeric face stimulus was more emotional, a face emotion recognition task, and the Multidimensional Social Competence Scale in an online experiment. Overall social competencies were predicted by the degree of chimeric face task bias to the right hemisphere (determined as the laterality quotient, LQ). Structural equation model analyses revealed that social inferencing and non-verbal sending skills were best predicted by LQ. In all analyses the predictive role of LQ was independent of face emotion recognition. Social cognition has previously been linked to the right hemisphere, but we report a novel relationship between lateralisation of face processing and aspects of social competencies that encompass both understanding and the display of social cues.
- Research Article
- 10.1186/s40359-024-02218-2
- Mar 18, 2025
- BMC Psychology
Background: Right-hemisphere brain regions are strongly implicated in facial emotion processing (FEP), a phenomenon termed right-hemispheric bias. Variability in FEP hemispheric bias is thought to underpin differences in facial emotion recognition ability and has been associated with age, handedness, biological sex, and autistic traits. However, findings from research to date investigating factors associated with FEP hemispheric bias have been inconsistent. Objective: To examine if FEP hemispheric bias can be predicted by individual factors such as age, biological sex, handedness, and autistic traits. Methods: 427 adults recruited from the general population aged 18–67 years completed the Autism-spectrum Quotient. We also assessed covariates previously linked with FEP hemispheric bias including age, handedness, and biological sex. FEP hemispheric bias was indexed using laterality quotients calculated from a Chimeric Faces Task, where participants indicated which of two identical (but mirrored) half-emotional half-neutral (no emotion) chimeric faces was more emotive. Results: Linear regression models revealed that (1) handedness predicted FEP hemispheric choice bias, (2) the attention switching Autism-spectrum Quotient subscale predicted FEP hemispheric reaction time bias, and (3) the imagination Autism-spectrum Quotient subscale predicted FEP hemispheric reaction time bias for males, but not females. Conclusions: These findings indicate that the relationship between autistic traits and FEP hemispheric bias is nuanced. Additionally, handedness influences hemispheric bias effects during FEP. Future research should endeavour to investigate if FEP hemispheric bias is dependent on the emotion being observed and consider using more direct measures of hemispheric bias.
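A laterality quotient of the kind described above can be computed directly from trial counts. The following is a minimal sketch, assuming the common (R − L)/(R + L) × 100 convention; the abstract does not give the paper's exact formula, and sign conventions vary across chimeric-face studies:

```python
def laterality_quotient(left_choices: int, right_choices: int) -> float:
    """Laterality quotient in [-100, 100] from chimeric-face trial counts.

    left_choices:  trials where the left-visual-field-emotional chimera
                   was judged more emotive
    right_choices: trials where the right-visual-field-emotional chimera
                   was judged more emotive

    Under this sign convention, a negative LQ indicates a left-visual-field
    (right-hemisphere) bias.
    """
    total = left_choices + right_choices
    if total == 0:
        raise ValueError("no valid trials")
    return 100.0 * (right_choices - left_choices) / total

# Example: 24 of 36 trials favoured the left-emotional chimera
print(laterality_quotient(left_choices=24, right_choices=12))
```

A per-participant LQ computed this way can then serve as the predictor in the regression or structural equation models the abstracts describe.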
- Research Article
- 10.1038/npp.2013.254
- Sep 26, 2013
- Neuropsychopharmacology
The neuropeptide oxytocin has recently been shown to enhance eye gaze and emotion recognition in healthy men. Here, we report a randomized double-blind, placebo-controlled trial that examined the neural and behavioral effects of a single dose of intranasal oxytocin on emotion recognition in individuals with Asperger syndrome (AS), a clinical condition characterized by impaired eye gaze and facial emotion recognition. Using functional magnetic resonance imaging, we examined whether oxytocin would enhance emotion recognition from facial sections of the eye vs the mouth region and modulate regional activity in brain areas associated with face perception in both adults with AS, and a neurotypical control group. Intranasal administration of the neuropeptide oxytocin improved performance in a facial emotion recognition task in individuals with AS. This was linked to increased left amygdala reactivity in response to facial stimuli and increased activity in the neural network involved in social cognition. Our data suggest that the amygdala, together with functionally associated cortical areas, mediates the positive effect of oxytocin on social cognitive functioning in AS.
- Research Article
- 10.1080/1357650x.2024.2377633
- Jul 3, 2024
- Laterality
Facial emotion processing (FEP) tends to be right hemisphere lateralized. This right-hemispheric bias (RHB) for FEP varies within and between individuals. The aim of the present research was to examine evidence pertaining to the prominent theories of FEP hemispheric bias as measured by a half-emotional half-neutral (no emotion) chimeric faces task. FEP hemispheric bias was indexed using laterality quotients (LQs) calculated from a Chimeric Faces Task completed by 427 adults recruited from the general population aged 18–67 years. Participants indicated which of two identical (but mirrored) emotional-neutral chimeric faces were more emotive. While all investigated emotions (fear, anger, and happiness) were right lateralized, fear was significantly more right lateralized than anger and happiness. These results provide evidence for both the right hemisphere hypothesis and the motivational hypothesis of emotion perception.
- Research Article
- 10.1016/j.psychres.2010.06.027
- Jul 18, 2010
- Psychiatry Research
Impairments of facial emotion recognition and theory of mind in methamphetamine abusers
- Research Article
- 10.1371/journal.pone.0339713
- Jan 1, 2026
- PloS one
Individuals diagnosed with schizophrenia experience cognitive impairments, including a decline in social cognition, which encompasses facial emotion recognition. Facial emotion recognition is an important aspect of social interaction, guiding people's actions and influencing their social functioning. Early childhood experiences, such as parental attachment, are one of the most influential factors in the development of many psychiatric symptoms including impairment of social cognition. Our aim was to explore this poorly researched area. We investigated the hypothesis that dysfunctional parenting styles negatively affect facial emotion recognition abilities in general and further worsen these deficits in individuals diagnosed with schizophrenia. A total of 32 participants were included in an exploratory study, comprising 16 patients with paranoid schizophrenia recruited from forensic clinics in Switzerland, and 16 age- and education-matched healthy controls with no history of psychiatric or neurological illness. Parental attachment was assessed using the Parental Bonding Inventory and subjects were assigned to subgroups of optimal vs. neglectful parenting style from both parental sides. Facial emotion recognition was operationalized as the error rate in an emotion-naming task using standardized images of the five basic emotions. Overall, schizophrenia patients made significantly more errors in the facial emotion recognition task than healthy controls. Interestingly, in the subgroup with optimal parental attachment experiences, patients did not significantly differ from controls, whereas in cases of neglectful parenting, the patients showed a much higher error rate in facial emotion recognition compared to healthy controls (p < .001), as well as compared to patients with optimal parenting experience (p < .01).
Neglectful parenting appears to exacerbate the adverse effects of schizophrenia on facial emotion recognition; i.e., optimal parenting might mitigate deficits in patients with schizophrenia spectrum disorder and help compensate for FER impairments.
- Abstract
- 10.1093/schbul/sbaa029.617
- May 1, 2020
- Schizophrenia Bulletin
Background: Schizophrenia patients and individuals at ultra-high risk for psychosis (UHR) have shown impaired facial emotion recognition (FER). Previous studies have reported lower accuracy and negative bias of FER in schizophrenia and UHR. These impairments have been studied with various factors such as schizotypy and paranoid level, but the results were inconsistent. This study aimed to identify the impairments of FER in UHR individuals and further to examine how these impairments relate to schizotypy and paranoid level. Methods: Forty-three UHR individuals and 57 normal controls (NC) were requested to perform a facial emotion recognition (FER) task consisting of 60 facial photographs selected from standardized photographs of the Ekman and Friesen series. For exploratory correlation analysis, schizotypy (Revised Physical Anhedonia Scale, Magical Ideation Scale) and paranoid level (Paranoia Scale, persecution/suspiciousness item of the Positive and Negative Syndrome Scale) were also examined in UHR individuals. Results: The UHR individuals showed a lower accuracy rate for the total FER task (70.6% vs. 75.6%, p=0.010) and more "fear" responses for neutral faces (14.5% vs. 6.0%, p=0.003) than NC. In exploratory correlation analysis for UHR individuals, the total accuracy rate of the FER task showed significant correlation with both scales for schizotypy, but not with both scales for paranoid level. Among threat-related emotion response rates for neutral faces, only the "disgust" response rate was correlated with all scales for paranoid level, but not with scales for schizotypy, in UHR individuals. Discussion: In this study, we identified inaccuracy and negative bias of FER in UHR individuals. Furthermore, we found that inaccuracy and negative bias were associated with schizotypy and paranoid level, respectively. These findings imply that inaccuracy and negative bias of FER in UHR individuals are of different natures. Future studies on the clinical implications of these findings are needed.
- Research Article
- 10.1093/scan/nsae013
- Feb 8, 2024
- Social Cognitive and Affective Neuroscience
The role of facial feedback in facial emotion recognition remains controversial, partly due to limitations of the existing methods to manipulate the activation of facial muscles, such as voluntary posing of facial expressions or holding a pen in the mouth. These procedures are indeed limited in their control over which muscles are (de)activated when and to what degree. To overcome these limitations and investigate in a more controlled way if facial emotion recognition is modulated by one’s facial muscle activity, we used computer-controlled facial neuromuscular electrical stimulation (fNMES). In a pre-registered EEG experiment, ambiguous facial expressions were categorised as happy or sad by 47 participants. In half of the trials, weak smiling was induced through fNMES delivered to the bilateral Zygomaticus Major muscle for 500 ms. The likelihood of categorising ambiguous facial expressions as happy was significantly increased with fNMES, as shown with frequentist and Bayesian linear mixed models. Further, fNMES resulted in a reduction of P1, N170 and LPP amplitudes. These findings suggest that fNMES-induced facial feedback can bias facial emotion recognition and modulate the neural correlates of face processing. We conclude that fNMES has potential as a tool for studying the effects of facial feedback.
- Research Article
- 10.3390/ijerph16245125
- Dec 1, 2019
- International Journal of Environmental Research and Public Health
Autism spectrum disorder (ASD) is a neurodevelopmental disorder that is characterized by impaired social interaction, communication and restricted and repetitive behavior. Few studies have focused on the effect of facial emotion recognition on bullying involvement among individuals with ASD. The aim of this study was to examine the association between facial emotion recognition and different types of bullying involvement in adolescents with high-functioning ASD. We recruited 138 adolescents aged 11 to 18 years with high-functioning ASD. The adolescents’ experiences of bullying involvement were measured using the Chinese version of the School Bullying Experience Questionnaire. Their facial emotion recognition was measured using the Facial Emotion Recognition Task (which measures six emotional expressions and four degrees of emotional intensity). Logistic regression analysis was used to examine the association between facial emotion recognition and different types of bullying involvement. After controlling for the effects of age, gender, depression, anxiety, inattention, hyperactivity/impulsivity and opposition, we observed that bullying perpetrators performed significantly better on rating the intensity of emotion in the Facial Emotion Recognition Task; bullying victims performed significantly worse on ranking the intensity of facial emotion. The results of this study support the different deficits of facial emotion recognition in various types of bullying involvement among adolescents with high-functioning ASD. The different directions of association between bully involvement and facial emotion recognition must be considered when developing prevention and intervention programs.
- Research Article
- 10.3389/fpsyt.2021.622077
- Jun 9, 2021
- Frontiers in Psychiatry
While culture and depression influence the way in which humans process emotion, these two areas of investigation are rarely combined. Therefore, the aim of this study was to investigate the difference in facial emotion recognition among Malaysian Malays and Australians with a European heritage with and without depression. A total of 88 participants took part in this study (Malays n = 47, Australians n = 41). All participants were screened using The Structured Clinical Interview for DSM-5 Clinician Version (SCID-5-CV) to assess the Major Depressive Disorder (MDD) diagnosis and they also completed the Beck Depression Inventory (BDI). This study consisted of the facial emotion recognition (FER) task whereby the participants were asked to look at facial images and determine the emotion depicted by each of the facial expressions. It was found that depression status and cultural group did not significantly influence overall FER accuracy. Malaysian participants without MDD and Australian participants with MDD performed more quickly than Australian participants without MDD on the FER task. Also, Malaysian participants recognized fear more accurately than Australian participants did. Future studies can focus on the extent of the influence and other aspects of culture and participant condition on facial emotion recognition.
- Research Article
- 10.1177/14771535241275481
- Sep 16, 2024
- Lighting Research & Technology
Road lighting should support the needs of pedestrians to make interpersonal evaluations after dark, for example, whether it feels safe to walk towards the person ahead or if avoiding action should be taken. In previous studies this has been investigated using a facial emotion recognition (FER) task, but using only full-face views of the person ahead. Other views are possible and this might affect the ability to make FER judgements and thus the impact of changes in light level. Reported here are the results of an experiment investigating FER with full-face and 3/4 views. The results show that while correct expression recognition is reduced with the 3/4 view, there is no interaction between face luminance and face view.
- Research Article
- 10.1111/ejn.13976
- Jul 1, 2018
- European Journal of Neuroscience
Working memory-based cognitive remediation therapy (CT) for psychosis has recently been associated with broad improvements in performance on untrained tasks measuring working memory, episodic memory and IQ, and changes in associated brain regions. However, it is unclear whether these improvements transfer to the domain of social cognition and neural activity related to performance on social cognitive tasks. We examined performance on the Reading the Mind in the Eyes test (Eyes test) in a large sample of participants with psychosis who underwent working memory-based CT (N=43) compared to a control group of participants with psychosis (N=35). In a subset of this sample, we used functional magnetic resonance imaging (fMRI) to examine changes in neural activity during a facial emotion recognition task in participants who underwent CT (N=15) compared to a control group (N=15). No significant effects of CT were observed on Eyes test performance or on neural activity during facial emotion recognition, either at p<0.05 family-wise error or at a p<0.001 uncorrected threshold, within a priori social cognitive regions of interest. This study suggests that working memory-based CT does not significantly impact an aspect of social cognition which was measured behaviourally and neurally. It provides further evidence that deficits in the ability to decode mental state from facial expressions are dissociable from working memory deficits, and suggests that future CT programmes should target social cognition in addition to working memory for the purposes of further enhancing social function.
- Research Article
- 10.3389/fpsyg.2013.00376
- Jun 26, 2013
- Frontiers in Psychology
Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure.
The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions.
- Research Article
- 10.3109/00207458908986221
- Jan 1, 1989
- The International journal of neuroscience
Mechanisms underlying hemispace biases for free-field judgments of emotional intensity in chimeric faces were explored. The Levy et al. chimeric faces task (1983b) was examined in relationship to relevant neuropsychological measures (emotional, imaginal, ocular). Forty-four normal adults were administered a test battery including measures of chimeric face perception, lateral eye movements to nonemotional and emotional instructions, image generation, and ocular dominance ("eyedness"). Overall, subjects showed a significant left-sided bias for judging chimeric faces and for producing lateral eye movements to emotional instructions. Asymmetries for chimeric face perception were significantly correlated with asymmetries for the location of self-generated images in space. When task modalities were examined, there was a specific relationship between chimeric face perception and tactile processing on the other neuropsychological measures.
- Research Article
- 10.2147/nss.s462540
- Oct 1, 2024
- Nature and science of sleep
To investigate the effects of sleep quality, sleep deprivation, and napping on facial emotion recognition (FER) accuracy and speed. This research included a cross-sectional study (102 qualified participants) and a randomized controlled study (26 in the napping group and 24 in the control group). The stimuli for the FER task were obtained from the Chinese Facial Affective Picture System (CFAPS). Four facial expressions (fearful, disgusted, sad, and angry) were used. The Pittsburgh Sleep Quality Index (PSQI), Self-Rating Anxiety Scale, and Self-Rating Depression Scale were used to measure participants' sleep quality and psychological conditions. In Study 1, FER ability was compared between good and poor sleepers. In Study 2, all participants were sleep-deprived for one night, and completed the FER task before and after sleep deprivation. After different interventions (i.e., napping for one hour, or walking around for ten minutes), the participants completed the third FER task. Study 1: Poor sleepers were able to recognize sad expressions more accurately compared with good sleepers. Study 2: 30-h sleep deprivation had no significant effect on accuracy (ACC). Napping after sleep deprivation improved the FER ACC of upper-face expressions and marginally significantly improved the FER ACC of disgusted expressions. Better sleep quality was linked to lower FER accuracy, particularly in recognizing sad expressions, while no significant differences in recognition speed were observed. Additionally, 30 hours of sleep deprivation did not affect FER accuracy, but napping after sleep deprivation improved accuracy for upper-face expressions and marginally for disgusted expressions.