
Related Topics

  • Facial Affect Recognition

Articles published on Facial Emotion Recognition

2840 Search results
  • New
  • Research Article
  • 10.20961/ijie.v9i2.110565
Real-Time Emotion Recognition in Online Learning Using Google Teachable Machine
  • Dec 31, 2025
  • IJIE (Indonesian Journal of Informatics Education)
  • Nazli Rahmeisi

Understanding learners’ emotional engagement in e-learning environments remains challenging due to the limited availability of non-verbal cues, despite its importance for motivation and participation. This paper proposes a facial emotion recognition approach using Google's Teachable Machine to support real-time emotion detection within online learning environments. The system analyzes facial expressions captured through a standard webcam to classify four basic emotional states: happy, sad, neutral, and angry. An experimental design was employed using simulated emotional expressions collected under controlled conditions, including adequate lighting and front-facing facial images. The results indicate that the system can provide instructors with additional affective cues to support formative assessment and instructional awareness in synchronous online learning. The proposed approach emphasizes practical instructional feasibility and accessibility compared to more complex emotion recognition models, as it does not require specialized hardware or advanced programming skills.
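The webcam pipeline described above — frames classified by an exported Teachable Machine image model into four emotions — can be sketched in Python. The preprocessing below assumes the 224×224 input and [-1, 1] pixel scaling commonly used by Teachable Machine's exported Keras models, and the label order in `EMOTIONS` is illustrative (it is fixed at export time); both are assumptions, not details from the paper.

```python
import numpy as np

# Four classes from the study; the actual order depends on the exported model (assumption).
EMOTIONS = ["happy", "sad", "neutral", "angry"]

def preprocess(frame, size=224):
    """Nearest-neighbour resize + scale pixels to [-1, 1] for model input."""
    h, w = frame.shape[:2]
    rows = np.arange(size) * h // size   # source row index per output row
    cols = np.arange(size) * w // size   # source column index per output column
    resized = frame[rows][:, cols]
    return resized.astype(np.float32) / 127.5 - 1.0

def decode(probs):
    """Map the model's softmax output back to an emotion label."""
    return EMOTIONS[int(np.argmax(probs))]
```

At inference time, `preprocess` would be applied to each webcam frame before calling the exported model's `predict`, and `decode` to the resulting probability vector.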

  • New
  • Research Article
  • 10.1186/s40359-025-03741-6
The effects of posttraumatic stress symptoms on anxiety and depression in patients with breast cancer: mediating role of negative interpretation bias.
  • Dec 30, 2025
  • BMC psychology
  • Miaomiao Wang + 5 more

This study examined the associations between posttraumatic stress symptoms (PTSS), anxiety, depression, and negative interpretation bias in breast cancer patients, with a specific focus on the potential mediating role of negative interpretation bias. Eighty breast cancer patients completed a cross-sectional assessment including the Impact of Event Scale (IES), the Hospital Anxiety and Depression Scale (HADS), and an ambiguous emotional face recognition task. Participants were categorized into PTSS and non-PTSS groups based on IES scores. Data were analyzed using descriptive statistics, independent-samples t-tests, ANOVA, and mediation analyses (PROCESS Model 4, 5,000 bootstraps, 95% CI). Patients with PTSS reported significantly higher levels of anxiety and depression, and demonstrated a stronger tendency to interpret ambiguous emotional expressions in a negative manner. Negative interpretation bias partially mediated the association between PTSS and emotional distress. Breast cancer patients with PTSS tend to experience heightened emotional distress, which may be partly accounted for by negative interpretation bias. Additional psychological and physiological mechanisms may also contribute.
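The mediation analysis reported above (PROCESS Model 4 with 5,000 bootstrap resamples and a 95% CI) estimates an indirect effect as the product of the X→M path (a) and the M→Y path controlling for X (b), with a percentile bootstrap for the interval. A minimal numpy sketch on synthetic data — variable names and data are illustrative, not from the study:

```python
import numpy as np

def ols_slope(x, y, covar=None):
    """OLS coefficient of x in a regression of y on [1, x] (plus optional covariate)."""
    cols = [np.ones_like(x), x] if covar is None else [np.ones_like(x), x, covar]
    beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return beta[1]

def indirect_effect(x, m, y):
    a = ols_slope(x, m)            # path a: X -> M
    b = ols_slope(m, y, covar=x)   # path b: M -> Y, controlling for X
    return a * b

def bootstrap_ci(x, m, y, n_boot=5000, seed=0):
    """Percentile-bootstrap 95% CI for the indirect effect a*b."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = [indirect_effect(*(v[idx] for v in (x, m, y)))
           for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(est, [2.5, 97.5])
```

If the bootstrap interval excludes zero, the indirect (mediated) path is taken as significant — the criterion PROCESS applies.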

  • New
  • Research Article
  • 10.1111/acps.70060
Digital Social Cognition Training Enhancing Social Cognition in Patients With Schizophrenia: A Systematic Review and Meta-Analysis.
  • Dec 28, 2025
  • Acta psychiatrica Scandinavica
  • Ching-Ming Wang + 4 more

Social cognition deficits, such as impaired emotion recognition, theory of mind (ToM), and empathy are central to schizophrenia-spectrum disorders and predict poor functional outcomes. Conventional therapies often lack scalability. Technology-based social cognition training (TBSCT), using computerized, web-based, or virtual reality (VR) formats, offers an accessible and adaptive solution. Following PRISMA guidelines (PROSPERO CRD420251021242), databases including Embase, MEDLINE, Cochrane, CINAHL, Web of Science, and PsycArticles were searched up to May 2025. Eligible randomized or quasi-experimental studies involved adults with schizophrenia-spectrum disorders receiving TBSCT targeting emotion recognition, ToM, or empathy. Methodological quality was assessed using Joanna Briggs Institute tools. Random-effects meta-analyses estimated pooled effect sizes (Hedges' g), with subgroup analyses by technology type, clinical setting, and intervention focus. Twenty-one studies (17 in meta-analysis) met inclusion criteria. TBSCT significantly enhanced facial emotion recognition (FER; g = 0.92, p < 0.001) and showed a small effect on ToM (g = 0.22, p = 0.05); empathy improved pre-post (g = 0.58, p = 0.04). FER benefits were strongest in web-based (g = 1.35), followed by VR (g = 0.87) and computerized programs (g = 0.75). ToM gains were significant only among outpatients. Methodological quality was moderate to high, with mild, transient adverse effects and comparable dropout rates (risk difference = -0.03, p = 0.21). TBSCT effectively improves social cognition, particularly emotion recognition in schizophrenia-spectrum disorders. The interventions are safe, feasible, and scalable, supporting clinical implementation. Future studies should examine long-term efficacy and optimize engagement across diverse psychiatric populations.
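The pooled effect sizes above (Hedges' g under a random-effects model) follow standard meta-analytic formulas: a standardized mean difference with a small-sample correction factor J, study-level variances, and DerSimonian-Laird estimation of between-study heterogeneity τ². A compact sketch on illustrative numbers (not the review's data):

```python
import numpy as np

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction J."""
    sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)   # correction factor, df = n1 + n2 - 2
    return j * d

def g_variance(g, n1, n2):
    """Approximate sampling variance of g."""
    return (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))

def pool_random_effects(gs, vs):
    """DerSimonian-Laird random-effects pooled estimate."""
    gs, vs = np.asarray(gs, float), np.asarray(vs, float)
    w = 1 / vs                                   # fixed-effect weights
    q = np.sum(w * (gs - np.sum(w * gs) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(gs) - 1)) / c)     # between-study variance
    w_star = 1 / (vs + tau2)                     # random-effects weights
    return np.sum(w_star * gs) / np.sum(w_star)
```

With per-study means, SDs, and group sizes, `hedges_g` and `g_variance` feed `pool_random_effects` to produce the pooled g values of the kind reported above.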

  • New
  • Abstract
  • 10.1002/alz70857_105781
A RoadMap for Neuropsychological Assessment of the Right Temporal Variant of Frontotemporal Dementia (rtvFTD): Case Studies and Practical Applications
  • Dec 25, 2025
  • Alzheimer's & Dementia
  • Loana De Los Santos + 6 more

Background: The right temporal variant of frontotemporal dementia (rtvFTD) is a neurodegenerative condition characterized by progressive atrophy of the right anterior temporal lobe (rATL), significantly impairing semantic‐pragmatic comprehension and social cognition. In Latin America, although magnetic resonance imaging (MRI) and computed tomography (CT) are widely available, there is still a need for neuropsychological tools to assess cognitive and social changes in rtvFTD. Currently, this condition remains a subject of debate due to diagnostic challenges stemming from a lack of consensus in terminology and variability in assessment tools (Ulugut et al., 2024; Younes et al., 2022). The aim of this study is to propose neuropsychological tools to characterize both the profile and cognitive changes of rtvFTD and present a structured roadmap to help differentiate rtvFTD from other dementias. Additionally, this roadmap contributes to the design of personalized therapeutic interventions. Method: Two clinical cases diagnosed with rtvFTD at FLENI (Buenos Aires, Argentina) were studied. Both patients underwent standard neuropsychological evaluations focused on semantic‐pragmatic language and social cognition, using locally adapted tests for naming, semantic verbal fluency, semantic association, prosody, pragmatics, and speech intentionality. Findings were correlated with MRI scans to validate the proposed roadmap. Result: The patients exhibited severe deficits in naming, semantic verbal fluency, semantic‐pragmatic impairments, and alterations in emotional prosody, theory of mind, and facial emotion recognition. Executive attentional systems, visuospatial abilities, and memory remained preserved. These findings aligned with patterns of atrophy and hypometabolism observed in the rATL and were consistent with current literature on the neuropsychological and clinical profiles of the rtvFTD. Figure 1 shows the proposed neuropsychological assessment approach, using a regionally adapted cognitive battery designed to capture rtvFTD symptoms in Spanish‐speaking populations and to guide differentiation from other dementia variants. Conclusion: This roadmap provides a practical guide that includes neuropsychological tests for the assessment of rtvFTD, particularly in Spanish‐speaking countries. By integrating evaluations targeting semantic‐pragmatic language and social cognition, the roadmap allows for precise differentiation of rtvFTD from other frontotemporal dementia variants. Furthermore, it contributes to the development of personalized therapeutic interventions, aiming to improve patient quality of life and support clinical practices in Spanish‐speaking regions.

  • New
  • Research Article
  • 10.3390/technologies14010008
Post Hoc Error Correction for Missing Classes in Deep Neural Networks
  • Dec 22, 2025
  • Technologies
  • Andrey A Lebedev + 2 more

This paper presents a novel post hoc error correction method that enables deep neural networks to recognize classes that were completely excluded during training. Unlike traditional approaches requiring full model retraining, our method uses hidden layer representations from any pre-trained classifier to detect and correct errors on missing categories. We demonstrate the approach on facial emotion recognition using the RAF-DB dataset, systematically excluding each of the seven emotion classes from training. The results show correction gains of up to 0.811 for excluded classes while maintaining 99% retention on known classes in the best setup. The method provides a computationally efficient alternative to retraining when new categories emerge after deployment.
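The abstract above does not spell out the correction mechanism, so the following is a generic sketch of the underlying idea rather than the paper's actual method: compute centroids of known classes in a hidden-layer feature space, then reassign test samples whose features lie far from every centroid to the held-out class. The threshold and all names here are hypothetical.

```python
import numpy as np

def class_centroids(features, labels):
    """Mean hidden-layer representation per known class."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def correct_predictions(features, preds, centroids, threshold, missing_label):
    """Reassign samples whose features sit far from every known-class centroid.

    A simple distance-based stand-in for post hoc error correction on a class
    excluded from training (illustrative, not the paper's exact procedure).
    """
    cents = np.stack(list(centroids.values()))
    # pairwise distances: (n_samples, n_known_classes)
    dists = np.linalg.norm(features[:, None, :] - cents[None, :, :], axis=2)
    is_novel = dists.min(axis=1) > threshold
    out = preds.copy()
    out[is_novel] = missing_label
    return out
```

The appeal of such post hoc schemes, as the abstract notes, is that the pre-trained classifier itself is left untouched; only its hidden representations are reused.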

  • New
  • Research Article
  • 10.51239/jictra.v16i1.354
Smart Class Analytics for Assessing Group Understanding via Facial Expressions
  • Dec 20, 2025
  • Journal of Information Communication Technologies and Robotic Applications
  • Haider Ali + 4 more

Understanding students' comprehension levels in an interactive classroom is critical. As technology becomes increasingly integral to our lives, advanced facial emotion recognition becomes a powerful tool for assessing comprehension in real time. We created a system that measures cognizance levels and analyzes facial emotions to ascertain comprehension. In this research, we present a new system for assessing the comprehension level of a group through facial expressions in real time. Our approach is built on libraries such as OpenCV and DeepFace, which provide several backends that are very effective for facial emotion recognition and analysis, including MTCNN and RetinaFace. First, we use OpenCV to capture real-time video from a computer webcam and extract frames at regular intervals; the frames are stored in a predefined directory. We then define a function that iterates over each frame, detects emotions via facial landmarks, and reports the dominant emotion for each face in the frame. A pretrained MTCNN model (one of the state-of-the-art models provided by DeepFace) is used to detect the face present in each frame and localize facial landmarks; MTCNN is a lightweight model that detects faces with high accuracy and speed. Finally, the DeepFace analyze() function is used for in-the-wild emotion analysis with the MTCNN backend, yielding the dominant emotion for each face. Comprehension levels were measured according to the detected emotional states: positive affect ['happy', 'neutral' (confidence >70%), and 'surprised' (confidence <=60%)] implies higher comprehension; negative affect ['sad', 'disgust', 'fear', and 'surprised' with confidence above 60%] indicates lower comprehension; and 'happy' or 'neutral' with confidence <=70% indicates moderate comprehension. The measured comprehension level is then classified as high, moderate, or low. The findings expose the complex interrelation among facial expressions, emotions, and comprehension levels in intragroup interactions. This research advances the understanding of human behavior and interaction dynamics, with connections to domains such as psychology, human-computer interaction, and education.
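The confidence-threshold rules in the abstract above map directly onto a small decision function. Labels and thresholds are taken from the abstract; the fallback branch for emotions outside the stated rules is an assumption.

```python
def comprehension_level(emotion, confidence):
    """Map a detected emotion and its confidence (%) to a comprehension level,
    following the thresholds stated in the abstract."""
    if emotion in ("sad", "disgust", "fear"):
        return "low"                    # negative affect -> lower comprehension
    if emotion == "surprised":
        # surprised counts as positive affect only at confidence <= 60%
        return "high" if confidence <= 60 else "low"
    if emotion in ("happy", "neutral"):
        # confidence > 70% -> higher comprehension, otherwise moderate
        return "high" if confidence > 70 else "moderate"
    return "moderate"                   # fallback for unlisted emotions (assumption)
```

In the described system, this function would be applied to the dominant emotion DeepFace reports per face, then the per-face levels aggregated for the group.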

  • Research Article
  • 10.1016/j.sleep.2025.108731
Facial recognition in late-life insomnia: Preserved positivity bias and associations with negative emotions.
  • Dec 17, 2025
  • Sleep medicine
  • Zilu Zhang + 3 more


  • Research Article
  • 10.1007/s44163-025-00553-w
Revolutionizing facial emotion recognition: in-depth analysis of cutting-edge models, methodologies, and datasets
  • Dec 17, 2025
  • Discover Artificial Intelligence
  • Ketan Sarvakar + 1 more


  • Research Article
  • 10.1007/s00371-025-04265-1
Enhancing real-time facial emotion recognition in classrooms via Attention-ResNet optimization
  • Dec 12, 2025
  • The Visual Computer
  • Haoming Wang + 4 more


  • Research Article
  • 10.3758/s13415-025-01378-x
Right-hemisphere lateralisation evidenced from the chimeric face task predicts self-reported social competencies.
  • Dec 10, 2025
  • Cognitive, affective & behavioral neuroscience
  • Vinh Nguyen + 1 more

Observing and understanding faces is a critical component of social interactions. The neural correlates of face processing are well established to be preferentially lateralised to the right hemisphere, though the functional role of this brain asymmetry has received less attention. Here we investigated the hypothesis that a left-visual-field (right-hemisphere) bias in face perception would be associated with a broader set of self-reported social competencies. Participants (n = 348) completed a chimeric face task, requiring judgements of which side of each chimeric face stimulus was more emotional, a face emotion recognition task, and the Multidimensional Social Competence Scale in an online experiment. Overall social competencies were predicted by the degree of chimeric face task bias to the right hemisphere (quantified as the laterality quotient, LQ). Structural equation model analyses revealed that social inferencing and non-verbal sending skills were best predicted by LQ. In all analyses, the predictive role of LQ was independent of face emotion recognition. Social cognition has previously been linked to the right hemisphere, but we report a novel relationship between lateralisation of face processing and aspects of social competencies that encompass both the understanding and the display of social cues.

  • Research Article
  • 10.1080/13854046.2025.2598352
Facial emotion recognition and empathy for pain in patients with type 2 diabetes mellitus
  • Dec 9, 2025
  • The Clinical Neuropsychologist
  • Gerardo Maldonado-Paz + 5 more

Objective: Type 2 diabetes mellitus (T2DM) is associated with cognitive decline, but its impact on social cognition remains poorly understood. This study investigated whether individuals with T2DM exhibit impairments in facial emotion recognition and empathy for pain, two domains crucial for daily interpersonal functioning that are often overlooked in neuropsychological assessments. Method: Seventy-six participants (37 with T2DM and 39 matched healthy controls) completed two validated social cognition tasks: a dynamic Facial Emotion Morphing Test and an empathy-for-pain task involving 25 animated scenarios (intentional, accidental, and neutral harm). Groups were matched for age, sex, and education. Analyses of covariance were conducted using Montreal Cognitive Assessment (MoCA) scores as covariates to control for global cognitive status. Results: Compared to controls, individuals with T2DM showed significantly lower overall emotion recognition accuracy (ηp² = 0.10) and fear recognition accuracy (ηp² = 0.06). In the empathy-for-pain task, they exhibited reduced intentionality comprehension (ηp² = 0.05, d = 0.73), increased attribution of harmful intent (ηp² = 0.05, d = −0.60), and harsher punishment judgments (ηp² = 0.08). These effects were of medium magnitude and were not explained by demographic, cognitive, or clinical factors. Conclusions: T2DM is associated with selective impairments in social cognition, even in the absence of global cognitive decline. These findings underscore the clinical utility of assessing social cognition in patients with T2DM, as such deficits may compromise interpersonal functioning and quality of life. Incorporating ecologically valid social cognition measures into neuropsychological evaluations may support early detection of brain dysfunction in metabolic conditions and inform interventions aimed at preserving social cognitive health.
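The ηp² (partial eta squared) values reported above are the standard ANCOVA effect-size measure: the effect's sum of squares as a proportion of effect plus error sums of squares, ignoring variance explained by other model terms such as the MoCA covariate. The formula, for reference:

```python
def partial_eta_squared(ss_effect, ss_error):
    """Partial eta squared: SS_effect / (SS_effect + SS_error).

    Proportion of the remaining variance attributable to the effect after
    partialling out the other terms in the model.
    """
    return ss_effect / (ss_effect + ss_error)
```

On this scale, values around 0.01, 0.06, and 0.14 are conventionally read as small, medium, and large, which is why the abstract describes effects of 0.05-0.10 as medium-sized.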

  • Research Article
  • 10.1016/j.neuroscience.2025.12.015
The relationship between the salience network and facial emotion recognition in social anxiety disorder.
  • Dec 8, 2025
  • Neuroscience
  • Kohei Kurita + 7 more


  • Research Article
  • 10.47626/1516-4446-2025-4448
Is a deficit in facial emotion recognition a predictor of risky alcohol use in young adults?
  • Dec 8, 2025
  • Revista brasileira de psiquiatria (Sao Paulo, Brazil : 1999)
  • Giovanna Petucco + 2 more


  • Research Article
  • 10.1177/08919887251407123
Differential changes of Social Cognition According to cognitive State and Evolution in Parkinson’s Disease
  • Dec 7, 2025
  • Journal of Geriatric Psychiatry and Neurology
  • Roberto Fernández-Fernández + 5 more

Objectives Social Cognition (SC) can be impaired in Parkinson’s Disease (PD), yet its longitudinal evolution relative to cognitive status is unclear. This study examined whether SC deficits in PD change differently depending on baseline cognitive status and cognitive progression. Methods In this observational study, 48 non‐demented PD patients (32 with normal cognition [PD‐CN], 16 with mild cognitive impairment [PD‐MCI]) and 22 healthy controls (HC) were assessed at baseline and after three years. SC was assessed for facial emotion recognition (FER), affective and cognitive Theory of Mind (ToM), and social behavior. A comprehensive neuropsychological battery provided domain-specific z-scores. Cognitive classification followed MDS Level II criteria. Adjusted linear mixed models examined SC changes. Delta scores for SC tasks and z-score changes were correlated. Results At baseline, PD-MCI patients scored lower on cognitive ToM than PD-CN and HC, with no significant group differences in affective ToM, FER, or social behavior. Over three years, PD-MCI patients experienced a significant decline in cognitive ToM compared to PD-CN and HC, while affective ToM and emotion recognition declined only relative to HC. Converters (n = 16) to a worse cognitive state (PD-CN to PD-MCI or PD-MCI to PDD) showed lower baseline cognitive ToM and a steeper decline than stable patients. All SC changes correlated with visuospatial ability; affective ToM also correlated with memory, language, and attention, and FER with memory and executive function. Conclusions Cognitive ToM declines in parallel with cognitive deterioration in PD, while remaining stable in PD-CN. SC measures may help identify patients at higher risk of cognitive decline.

  • Research Article
  • 10.1038/s41598-025-25393-7
No evidence for disruption of empathy and mentalizing by face coverage in multimodal settings
  • Dec 5, 2025
  • Scientific Reports
  • Eva Landmann + 2 more

In social interactions, we often encounter situations where a partner’s face is (partially) occluded, e.g., when wearing a mask. While emotion recognition in static faces is known to be less accurate under such conditions, we investigated whether these detrimental effects extend to empathic responding, mentalizing (i.e., Theory of Mind), and prosociality in more naturalistic settings. In four studies (N_total = 157), we presented short video clips of narrators recounting neutral and emotionally negative autobiographical stories, with their faces shown in four conditions (two per experiment): fully visible, eyes covered, mouth covered, and audio-only. Participants then responded to questions assessing affect, mentalizing performance, and willingness to help. Affect ratings were slightly lower when the narrator’s mouth was covered, and participants were less willing to help narrators with covered eyes. Importantly, however, empathic responding and mentalizing performance remained robust across visibility conditions. Thus, our findings suggest that social understanding – specifically, empathizing and mentalizing – is not substantially impeded by partial or complete facial occlusion when other cues, such as vocal information, can be used to compensate. These insights may help contextualize concerns about detrimental effects of face coverage in social interactions. Supplementary Information: The online version contains supplementary material available at 10.1038/s41598-025-25393-7.

  • Research Article
  • 10.1080/02699931.2025.2596318
Stable abnormalities on the recognition of dynamic angry facial emotional expression in subthreshold depression
  • Dec 4, 2025
  • Cognition and Emotion
  • Xu Luo + 2 more

ABSTRACT Subthreshold depression (StD), a subclinical depressive state, is highly prevalent and elevates the risk of developing major depressive disorder. Previous studies have found that individuals with StD are impaired in facial emotional expression recognition, yet these studies primarily used static rather than dynamic facial emotional expressions, which have relatively high ecological validity. It remains unclear whether StD is associated with impaired recognition of dynamic facial emotional expressions and whether any abnormalities are stable over time. Forty-six individuals with StD and forty-five non-depressed individuals performed a dynamic and a static facial emotional expression recognition task, with a follow-up assessment using the same tasks after a 4-month interval. In the dynamic task, StD individuals showed lower recognition thresholds only for the angry emotional expression at both the initial and follow-up assessments, compared to the non-depressed individuals. In the static task, the StD group demonstrated significantly higher accuracy only for angry expressions at the initial assessment but not at the follow-up assessment. These results indicate that the dynamic facial expression recognition task, which has higher ecological validity than the static task, may serve as an auxiliary objective marker for depression.

  • Research Article
  • 10.3390/s25237375
ArecaNet: Robust Facial Emotion Recognition via Assembled Residual Enhanced Cross-Attention Networks for Emotion-Aware Human-Computer Interaction.
  • Dec 4, 2025
  • Sensors (Basel, Switzerland)
  • Jaemyung Kim + 1 more

Recently, the convergence of advanced sensor technologies and innovations in artificial intelligence and robotics has highlighted facial emotion recognition (FER) as an essential component of human-computer interaction (HCI). Traditional FER studies based on handcrafted features and shallow machine learning have shown a limited performance, while convolutional neural networks (CNNs) have improved nonlinear emotion pattern analysis but have been constrained by local feature extraction. Vision transformers (ViTs) have addressed this by leveraging global correlations, yet both CNN- and ViT-based single networks often suffer from overfitting, single-network dependency, and information loss in ensemble operations. To overcome these limitations, we propose ArecaNet, an assembled residual enhanced cross-attention network that integrates multiple feature streams without information loss. The framework comprises (i) channel and spatial feature extraction via SCSESResNet, (ii) landmark feature extraction from specialized sub-networks, (iii) iterative fusion through residual enhanced cross-attention, (iv) final emotion classification from the fused representation. Our research introduces a novel approach by integrating pre-trained sub-networks specialized in facial recognition with an attention mechanism and our uniquely designed main network, which is optimized for size reduction and efficient feature extraction. The extracted features are fused through an iterative residual enhanced cross-attention mechanism, which minimizes information loss and preserves complementary representations across networks. This strategy overcomes the limitations of conventional ensemble methods, enabling seamless feature integration and robust recognition. 
The experimental results show that the proposed ArecaNet achieved accuracies of 97.0% on FER-2013 and 97.8% on RAF-DB, surpassing the previous state-of-the-art method, PAtt-Lite, by 4.5% and 2.75% respectively, and establishing a new state-of-the-art accuracy on both public databases.
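Cross-attention, the core fusion operation named above, lets one feature stream query another; a residual add then preserves the querying stream's original features, which is one way to avoid the information loss the abstract attributes to plain ensembling. A generic single-head numpy sketch (dimensions and weights are illustrative, not ArecaNet's actual architecture):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_cross_attention(a, b, wq, wk, wv):
    """Stream A attends over stream B; a residual add keeps A's own features.

    a: (n_a, d) features from one sub-network, b: (n_b, d) from another;
    wq, wk, wv: (d, d) projection matrices (illustrative).
    """
    q, k, v = a @ wq, b @ wk, b @ wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))  # scaled dot-product weights
    return a + attn @ v                             # residual connection
```

Applied iteratively across the channel/spatial and landmark streams described above, this kind of block fuses complementary representations while each stream's original information survives the fusion.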

  • Research Article
  • 10.3390/s25237320
An Explainable Framework for Mental Health Monitoring Using Lightweight and Privacy-Preserving Federated Facial Emotion Recognition
  • Dec 2, 2025
  • Sensors (Basel, Switzerland)
  • Dina Shehada + 3 more

The continuous analysis of emotional cues through facial emotion recognition (FER) systems can support mental health evaluation and psychological well-being monitoring systems. Most FER systems face privacy and trust concerns due to their centralized data approaches and lack of transparency, making potential deployment difficult. To address these concerns, a federated, explainability-driven FER framework designed to provide trustworthy and privacy-preserving emotion recognition with potential applications in mental health monitoring is proposed in this paper. The proposed lightweight Convolutional Neural Network (CNN) enables real-time inference while preserving high accuracy. Comprehensive evaluations on RAF-DB, ExpW, and FER2013 datasets, show that the proposed model demonstrates improved cross-dataset generalization compared to related works, achieving average accuracies of 75.5% and 74.3% in centralized and federated settings, respectively. Quantitative perturbation-based metrics, including Insertion and Deletion Area Under Curve (IAUC and DAUC), Average Drop (AD), Increase in Confidence (IC), Average Drop in Accuracy (ADA), and Active Pixel Ratio, were employed to objectively evaluate the quality and reliability of the model Grad-CAM++ explanations. The results confirm that model explainability enhances transparency and is directly associated with improved model performance.
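The federated setting described above keeps raw facial data on each client and shares only model parameters with a server for aggregation. The abstract does not specify the aggregation rule, so the sketch below uses classic FedAvg (size-weighted parameter averaging) as a representative stand-in:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg: average each parameter tensor, weighted by local dataset size.

    client_weights: list of per-client models, each a list of numpy arrays.
    Raw training data never leaves the client; only these tensors are shared.
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
            for i in range(n_params)]
```

Each communication round would broadcast the averaged parameters back to the clients for further local training, which is how federated FER models like the one above reach cross-dataset accuracy close to their centralized counterparts.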

  • Research Article
  • 10.1007/s40617-025-01133-1
Measurement of Emotions Tacting for Empathic Responding (METER): An Example of a Process for Creating an Inclusive Assessment of Emotion Recognition using Validated and Diverse Facial Expression Stimuli
  • Dec 2, 2025
  • Behavior Analysis in Practice
  • Lydia S Lindsey + 2 more

Abstract Many social skills, such as empathic responding, social referencing, and facial emotion recognition, require a variety of conditional discriminations under a wide array of stimulus conditions. Proficiency with these responses in the natural environment would involve the ability to identify a variety of emotions across a wide array of faces, genders, ages, ethnicities, and contexts. Using empirically validated stimuli within assessment contexts that represent a wide spectrum of diverse variations across relevant features increases the likelihood of teaching stimulus discriminations necessary for broadly applicable emotion tacting skills. Currently, there is little guidance in behavior analysis on how to conduct a comprehensive assessment of emotions tacting across diverse demographics using empirically validated stimuli. Therefore, this manuscript provides an example process we adopted to create a preliminary assessment of facial emotion recognition that includes empirically validated stimuli representing a multitude of diverse faces, which we named the “Measurement of Emotions Tacting for Empathic Responding” (METER). It is our hope this assessment tutorial will help bring awareness to the importance of identifying appropriate validated and demographically diverse stimuli, the issues that may arise from overlooking the importance of the stimuli we use to assess and teach complex social skills, and to encourage researchers and practitioners to develop inclusive assessments for a variety of social skills using validated and diverse stimuli to aid in developing both targeted and socially valid interventions.

  • Research Article
  • 10.1016/j.ins.2025.123005
EmoContextNet: A real-time adaptive large model with spatiotemporal-spectral fusion for multi-context facial emotion recognition
  • Dec 1, 2025
  • Information Sciences
  • Dewei Yu + 5 more



Copyright 2026 Cactus Communications. All rights reserved.
