
Emotion Recognition Research Articles

Overview
16052 Articles

Published in last 50 years

Related Topics

  • Emotion Recognition Task
  • Emotion Recognition System
  • Emotional Speech
  • Affect Recognition

Articles published on Emotion Recognition

15397 search results, sorted by recency
  • New
  • Research Article
  • 10.54531/colr9799
A62 Teaching Hot Debriefing to Paediatric Resident Doctors: Cultivating a Culture of Reflection and Psychological Safety
  • Nov 4, 2025
  • Journal of Healthcare Simulation
  • Sabah Hussain + 2 more

Introduction: In high-pressure clinical environments, fostering a culture that encourages reflection, learning, and emotional wellbeing is essential. Hot debriefing offers an immediate, structured opportunity for teams to reflect on critical events, strengthen communication, and embed psychological safety into regular practice [1]. This teaching session aimed to educate resident paediatric doctors on the importance of a hot debrief and to introduce relevant models that support cultural transformation by normalising reflective practice. Methods: A multidisciplinary teaching session was delivered to 25 resident paediatric doctors, focusing on the practical application of hot debriefing. The session included a structured approach and a set of practical tools for initiating team-based hot debriefs. Through videos and simulations, we embedded principles of psychological safety, emotional recognition, and inclusive dialogue. To facilitate real-time feedback, gather the resident doctors' thoughts, and enable a collaborative environment, we used Slido within the session. Pre- and post-session surveys were used to assess changes in experience and confidence, and to identify future training needs. Qualitative comments were collected to capture perceived cultural and emotional impact. Results: Pre-course data showed that 80% of participants had little or no prior experience with hot debriefing. Following the session, 84% reported feeling moderately or much more confident in asking for a debrief. Additionally, 84% expressed interest in receiving further training on how to lead debriefs. Qualitative feedback consistently highlighted a shift in attitude toward team communication and support, with participants valuing the normalisation of discussing emotional responses. Many viewed the session as a catalyst for change, helping to challenge existing cultural norms of silence after difficult events and encouraging learning from them.
Discussion: The introduction of hot debriefing as both a concept and a structured practice contributed to a visible cultural shift within clinical teams. Rather than treating debriefs as optional or exceptional, the session repositioned them as integral to team-based care and resilience. By normalising immediate reflection, hot debriefing supports a compassionate, safety-oriented culture that prioritises emotional well-being alongside clinical outcomes. As healthcare organisations aim to address burnout, improve safety, and foster inclusive team dynamics, scalable interventions like hot debriefing can serve as foundational tools to drive cultural transformation from the ground up [2]. Going forward, we would like to deliver these sessions to all paediatric resident doctors and to incorporate more simulation-based education into them to strengthen a team culture that supports open communication, compassion, and continuous learning. Ethics Statement: As the submitting author, I can confirm that all relevant ethical standards of research and dissemination have been met. Additionally, I can confirm that the necessary ethical approval has been obtained, where applicable.

  • New
  • Research Article
  • 10.7717/peerj-cs.3287
Adaptive multitask emotion recognition and sentiment analysis using resource-constrained MobileBERT and DistilBERT: an efficient approach for edge devices
  • Nov 3, 2025
  • PeerJ Computer Science
  • Muhammad Hussain + 6 more

Emotion recognition and sentiment analysis are crucial tasks in natural language processing, enabling machines to understand human emotions and opinions. However, the complex, nuanced relationship between emotions and sentiment in conversation poses significant challenges to accurate emotion recognition, as sentiment cues can be easily misinterpreted. Deploying emotion recognition and sentiment analysis tasks on edge devices poses substantial challenges due to computational resource constraints. We present an adaptive multitask learning approach that jointly leverages resource-constrained Mobile Bidirectional Encoder Representations from Transformers (MobileBERT) and Distilled BERT (DistilBERT) models to optimise emotion recognition and sentiment analysis. Our proposed approach utilises prototypical networks to learn effective representations of emotions and sentiment, while a focal weighted loss function effectively mitigates the class imbalance. We adaptively fine-tune the learning process to balance task importance and resource utilisation, resulting in better performance and efficiency. Our experimental results demonstrate the efficacy of our method, achieving the best results on the MELD and IEMOCAP benchmark datasets while keeping a compact model size. Despite limited computational demands, our solution demonstrates that emotion and sentiment analysis can deliver performance comparable to resource-intensive large language models (LLMs), facilitating various applications in human-computer interaction, affective computing, social media, dialogue conversation, and healthcare.
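The abstract above credits a "focal weighted loss" with mitigating class imbalance but does not spell it out. As a generic illustration (the standard focal loss formulation, not necessarily the authors' exact variant), a minimal sketch:

```python
import math

def focal_loss(p_true, gamma=2.0, alpha=1.0):
    """Focal loss for one example, where p_true is the model's predicted
    probability of the correct class. The (1 - p_true)**gamma factor
    shrinks the loss on easy, well-classified examples, so the gradient
    signal concentrates on hard and minority-class examples."""
    return -alpha * (1.0 - p_true) ** gamma * math.log(p_true)

def cross_entropy(p_true):
    """Plain cross-entropy, for comparison; focal loss with gamma = 0."""
    return -math.log(p_true)
```

With gamma = 2, a confidently correct prediction (p_true = 0.9) keeps only 1% of its cross-entropy loss, while a hard example (p_true = 0.1) keeps 81% of it, which is how the loss counteracts class imbalance.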

  • New
  • Research Article
  • 10.1145/3774428
EEG-based Multimodal Emotion Recognition: Recent Progress, Challenges, and Future Directions
  • Nov 3, 2025
  • ACM Transactions on Multimedia Computing, Communications, and Applications
  • Ghulam Muhammad + 4 more

Emotion recognition is a crucial part of cognitive computing. Traditional emotion recognition systems include audio-visual modality. However, a recent trend in recognizing emotions is to use physiological signals such as the electroencephalogram (EEG). EEG signals, together with audio-visual and other physiological signals, improve the performance of emotion recognition systems. This paper presents a systematic literature review on EEG-based multimodal (multimedia) emotion recognition systems for the last five years. Three major research questions are addressed: (1) What kind of learning models are used in EEG-based multimedia emotion recognition? (2) What are the publicly available related datasets? (3) What are the challenges and future directions of this topic? The answers to the research questions are provided in different subsections.

  • New
  • Research Article
  • 10.1038/s41380-025-03324-2
Altered acetylcholine modulations and corticoaccumbal pathway in P11-linked social dysfunction.
  • Nov 3, 2025
  • Molecular psychiatry
  • Daniel Dautan + 10 more

Social relationships rely on the willingness to interact with others and the ability to interpret their emotional cues. Major depressive disorder (MDD) often leads to dysfunctional social interactions, marked by reduced social motivation and difficulties in recognizing emotions, yet these issues remain inadequately explored despite their significant impact on quality of life. These social behaviors, interconnected through the corticoaccumbal pathway, balance anxiety and social interaction, but the underlying mechanisms remain poorly understood. Notably, the calcium-binding protein S100A10 (also known as P11), which is dysregulated in MDD patients and influences the response to antidepressants, is prominently expressed in brain structures involved in social and emotional processing. Here, we demonstrate that chronic restraint stress alters P11 expression along the corticoaccumbal circuit. Additionally, our genetic model, P11-knockout mice, exhibit depression-like behavior, including a reduction of social motivation and impaired recognition of conspecific emotions. Using in vivo and ex vivo electrophysiology, we reveal that P11 expression modulates the response of the corticoaccumbal pathway, influencing the balance between anxiety and social interaction, as well as emotion recognition by regulating dopamine and acetylcholine release in the accumbens. Interestingly, we pinpoint the role of different cholinergic structures in anxiety, social motivation, and emotion recognition. Finally, we show that the prosocial compound oxytocin and social buffering therapy were able to rescue the socially impaired behaviors following chronic stress or P11 ablation, opening new avenues for potential treatments.

  • New
  • Research Article
  • 10.52015/nijec.v4i1.92
A Novel Framework for the Accuracy Enhancement of Facial Expression Recognition System
  • Nov 3, 2025
  • NUML International Journal of Engineering and Computing
  • Naila Batool + 2 more

Facial Expression Recognition has become a promising field for more natural interaction with computing devices and machines and has been the focus of many research scholars over the past decade. Newly developed facial emotion recognition methods focus on the neutral expression or the six basic expressions used in most state-of-the-art methods. Accuracy remains the main problem in facial expression recognition results. The problem to be tackled is the optimization of the expression recognition algorithm, i.e., to detect, isolate, and correctly classify one of the major expressions of the human face with accuracy approaching 100%. This work aims to improve the accuracy of facial expression recognition by using Histogram of Oriented Gradients (HOG) and Local Ternary Pattern (LTP).
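HOG, one of the two descriptors named in the abstract above, reduces to a simple recipe: compute intensity gradients, then histogram their orientations weighted by gradient magnitude. A minimal per-cell sketch in plain Python (an illustration of the general technique, not the paper's implementation; real systems also add block normalisation and combine HOG with texture codes such as LTP):

```python
import math

def hog_cell(patch, n_bins=9):
    """Histogram of oriented gradients for one cell.
    `patch` is a list of rows of grayscale intensities. Uses unsigned
    gradient orientations binned over [0, 180) degrees, with each pixel
    contributing its gradient magnitude to its orientation bin."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * n_bins
    bin_width = 180.0 / n_bins
    for y in range(1, h - 1):          # skip the border pixels, since
        for x in range(1, w - 1):      # central differences need neighbours
            gx = patch[y][x + 1] - patch[y][x - 1]
            gy = patch[y + 1][x] - patch[y - 1][x]
            magnitude = math.hypot(gx, gy)
            angle = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(angle // bin_width) % n_bins] += magnitude
    return hist
```

On a patch whose intensity increases only left to right, every gradient points at 0 degrees, so all of the histogram mass lands in the first bin.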

  • New
  • Research Article
  • 10.56919/2543.004
A Review of Emotion Recognition in Virtual Learning Environments and its Educational Impact
  • Nov 3, 2025
  • UMYU Scientifica
  • Joshua Jimba + 5 more

Understanding and addressing students' emotional needs is crucial in the rapidly evolving domain of online learning, as it fosters students' motivation, interest, and educational outcomes. This literature review examines the methods, findings, and implications of recent studies that attempt to identify and analyze emotions in online learning contexts. Methodologically, a systematic review approach was employed to analyze a wide variety of academic publications released between 2021 and 2024. The surveyed studies employed various methods to recognize emotions, such as happiness, sadness, and interest, in virtual learning environments, including physiological signal analysis, deep learning models, and machine learning algorithms. The review points to significant progress in emotion detection technology: the reviewed studies show how effectively deep learning and machine learning models can recognize and interpret students' emotional expressions. Findings from the reviewed papers show that models such as CNN, LSTM, SVM, ViT, and brain-computer interfaces have been employed with varying degrees of accuracy (ranging from 55% to over 90%). In addition, real-time feedback mechanisms that recognize emotions have the potential to improve learning outcomes, motivation, and student engagement in online learning environments.

  • New
  • Research Article
  • 10.32996/jbms.2025.7.7.4
Construction of an Adaptive Communication Model for Cross-border E-commerce AI Customer Service under Cultural Context Differences: A Case Study of the Russian Market
  • Nov 2, 2025
  • Journal of Business and Management Studies
  • Huizhu Tan

Against the backdrop of the rapid expansion of Sino-Russian cross-border e-commerce, AI customer service systems face significant challenges in adapting to high-context cultural environments. Based on Hall’s high-/low-context theory, this study analyzes typical user complaint cases from the Russian market and identifies a structural mismatch between the low-context communication patterns of existing AI customer service systems and the high-context expectations of Russian users. This mismatch manifests in three key dimensions: rigid language, poor contextual interpretation, and an inability to build relationships. To address these issues, this paper proposes an adaptive communication model comprising three integrated layers: a context perception layer, a strategy generation layer, and an interaction execution layer. Driven by a culturally sensitive communication strategy library, the model is designed to shift AI customer service from standardized responses to contextualized communication. By integrating mechanisms such as emotion recognition, indirect intent inference, and relationship-building dialogue, the model aims to enhance user satisfaction and trust in high-context markets. This study offers a structured framework for incorporating cultural theory into the design of cross-border AI customer service systems and suggests directions for future research, including expansion to other high-context regions and multimodal interaction.

  • New
  • Research Article
  • 10.1016/j.yhbeh.2025.105843
Emotion recognition largely unaffected by combined oral contraceptive transitions or their androgenicity.
  • Nov 1, 2025
  • Hormones and behavior
  • Ann-Christin S Kimmig + 2 more

  • New
  • Research Article
  • 10.1016/j.neunet.2025.107853
ACCNet: Adaptive cross-frequency coupling graph attention for EEG emotion recognition.
  • Nov 1, 2025
  • Neural networks : the official journal of the International Neural Network Society
  • Dongyuan Tian + 6 more

  • New
  • Research Article
  • 10.1016/j.yebeh.2025.110549
Facial emotion recognition in focal epilepsy: localization is not the main factor.
  • Nov 1, 2025
  • Epilepsy & behavior : E&B
  • Mathilde Grangé + 6 more

  • New
  • Research Article
  • 10.1016/j.pedn.2025.09.017
Applications of artificial intelligence in emotion recognition in pediatrics health care: Scoping review.
  • Nov 1, 2025
  • Journal of pediatric nursing
  • Ana Rita Figueiredo + 4 more

  • New
  • Research Article
  • 10.3390/s25216670
A Facial-Expression-Aware Edge AI System for Driver Safety Monitoring
  • Nov 1, 2025
  • Sensors
  • Maram A Almodhwahi + 1 more

Road safety has emerged as a global issue, driven by the rapid rise in vehicle ownership and traffic congestion. Human error, like distraction, drowsiness, and panic, is the leading cause of road accidents. Conventional driver monitoring systems (DMSs) frequently fail to detect these emotional and cognitive states, limiting their potential to prevent accidents. To overcome these challenges, this work proposes a robust deep learning-based DMS framework capable of real-time detection and response to emotion-driven driver behaviors that pose safety risks. The proposed system employs convolutional neural networks (CNNs), specifically the Inception module and a Caffe-based ResNet-10 with a Single Shot Detector (SSD), to achieve efficient, accurate facial detection and classification. The DMS is trained on a comprehensive and diverse dataset from various public and private sources, ensuring robustness across a wide range of emotions and real-world driving scenarios. This approach enables the model to achieve an overall accuracy of 98.6%, an F1 score of 0.979, a precision of 0.980, and a recall of 0.979 across the four emotional states. Compared with existing techniques, the proposed model strikes an effective balance between computational efficiency and complexity, enabling the precise recognition of driving-relevant emotions, making it a practical and high-performing solution for real-world in-car driver monitoring systems.

  • New
  • Research Article
  • 10.1016/j.bspc.2025.108016
Multi-source domain separation adversarial domain adaptation for EEG emotion recognition
  • Nov 1, 2025
  • Biomedical Signal Processing and Control
  • Qingsong Ai + 3 more

  • New
  • Research Article
  • 10.1016/j.patcog.2025.111720
Context transformer with multiscale fusion for robust facial emotion recognition
  • Nov 1, 2025
  • Pattern Recognition
  • Yanling Gan + 3 more

  • New
  • Research Article
  • 10.1016/j.specom.2025.103313
MDCNN: A multimodal dual-CNN recursive model for fake news detection via audio- and text-based speech emotion recognition
  • Nov 1, 2025
  • Speech Communication
  • Hongchen Wu + 13 more

  • New
  • Research Article
  • 10.1016/j.inffus.2025.103279
A self-supervised data augmentation strategy for EEG-based emotion recognition
  • Nov 1, 2025
  • Information Fusion
  • Yingxiao Qiao + 1 more

  • New
  • Research Article
  • 10.1016/j.schres.2025.08.012
Psychometric validation of social cognition measures in U.S. Hispanic individuals with schizophrenia.
  • Nov 1, 2025
  • Schizophrenia research
  • Ana T Flores + 6 more

  • New
  • Research Article
  • 10.1016/j.engappai.2025.111721
Coverage-guaranteed speech emotion recognition via calibrated uncertainty-adaptive prediction sets
  • Nov 1, 2025
  • Engineering Applications of Artificial Intelligence
  • Zijun Jia + 3 more

  • New
  • Research Article
  • 10.1016/j.asoc.2025.113659
Enhancing EEG-based individual-generic emotion recognition through invariant sparse patterns extracted from ongoing affective processes
  • Nov 1, 2025
  • Applied Soft Computing
  • Yiwen Zhu + 5 more

  • New
  • Research Article
  • Cited by: 1
  • 10.1016/j.inffus.2025.103268
RMER-DT: Robust multimodal emotion recognition in conversational contexts based on diffusion and transformers
  • Nov 1, 2025
  • Information Fusion
  • Xianxun Zhu + 6 more

Copyright 2025 Cactus Communications. All rights reserved.
