Enhancing EEG-based individual-generic emotion recognition through invariant sparse patterns extracted from ongoing affective processes
- Dissertation
- 10.53846/goediss-4980
- Feb 20, 2022
The impact of vocal expressions on the understanding of affective states in others
- Research Article
4
- 10.1016/j.bionps.2024.100106
- Aug 23, 2024
- Biomarkers in Neuropsychiatry
We investigated affective processing in aphantasia (= absent or reduced vividness of mental imagery), considering a possible overlap with alexithymia (= deficits in identifying and describing emotions), as reduced vividness of mental imagery is also reported in alexithymia. Study 1 assessed physiological reactions and self-reported sympathy in n = 30 individuals with aphantasia and n = 75 controls when confronted with visual and verbal material showing people in distress. Results demonstrated that individuals with aphantasia show reduced emotional responses, especially to verbal stimuli. This is of particular importance given the higher prevalence of alexithymic symptoms in aphantasic participants, notably in externally-oriented thinking and difficulties in describing feelings. An additional mediation analysis confirmed that vividness of visual imagery mediated the association between alexithymia and self-reported sympathy. Study 2 extended our exploration to the recognition of emotions in others using the same sample. Despite accurate recognition of emotions, individuals with aphantasia exhibited significantly slower response times, suggesting less efficient strategies that do not involve mental imagery. Our findings highlight the crucial role of mental imagery in the interplay of cognitive functions and affective processes, demonstrating how conditions such as aphantasia and alexithymia can affect sympathy and, more generally, emotions.
- Research Article
20
- 10.1176/appi.neuropsych.20.1.86
- Feb 1, 2008
- Journal of Neuropsychiatry
The Neuropsychiatry and Neuropsychology of Lipoid Proteinosis
- Research Article
5
- 10.26021/1826
- Jan 1, 2007
Empirical research provides evidence of strong interaction between cognitive and affective processes in the human mind. Education research proposes a model of constructive learning that relates cognitive and affective processes in an evolving cycle of affective states. Intelligent Tutoring Systems (ITSs) are capable of providing comprehensive cognitive support. Affective support in ITSs, however, is lagging behind; the in-depth exploration of cognitive and affective processes in ITSs is yet to be seen. Our research focuses on the integration of affective support in an ITS enhanced with an affective pedagogical agent. In our work we adopt the dimensional (versus categorical) view of emotions for modelling the affective states of the agent and of the ITS's users. In two stages we develop and evaluate an affective pedagogical agent. The affective response of the first agent version is based on the appraisal of the interaction state; this agent's affective response is displayed as affective facial expressions. The pilot study at the end of the first stage of the project confirms the viability of our approach, which combines the dimensional view of emotions with the appraisal of interaction state. In the second stage of the project we develop a facial feature tracking application for real-time emotion recognition in a video stream. Affective awareness of the second version of the agent is based on the output from the facial feature tracking application and the appraisal of the interaction state. This agent's response takes the form of affect-oriented messages designed to interrupt the state of negative flow. The evaluation of the affect-aware agent against an unemotional, affect-unaware agent provides positive results, confirming the superiority of the affect-aware agent. Although the uptake of the agent was not unanimous, the agent established and maintained good rapport with the users in the role of a caring tutor. The results of the pilot study and the final evaluation validate our choices in the design of affective interaction. In both experiments, the participants appreciated the addition of audible feedback messages, describing it as an enhancement which helped them save time and maintain their focus. Finally, we offer directions for future research on affective support which can be conducted within the framework developed in the course of this project.
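As a toy illustration of the dimensional (valence/arousal) appraisal of interaction state described above, the sketch below maps a few interaction-state fields to a valence-arousal pair and a trigger for an affect-oriented message. The state fields, weights, and trigger rule are illustrative assumptions, not the thesis's actual appraisal model.

```python
from dataclasses import dataclass

@dataclass
class InteractionState:
    # Hypothetical fields an ITS might track; not taken from the thesis.
    last_answer_correct: bool
    consecutive_errors: int
    seconds_idle: float

def appraise(state: InteractionState) -> tuple[float, float]:
    """Map an interaction state to (valence, arousal), each clamped to [-1, 1]."""
    valence = 0.6 if state.last_answer_correct else -0.3 * min(state.consecutive_errors, 3)
    arousal = max(-1.0, min(1.0, 1.0 - state.seconds_idle / 60.0))  # idling lowers arousal
    return max(-1.0, min(1.0, valence)), arousal

# A run of errors combined with sinking arousal could trigger an
# affect-oriented message meant to interrupt negative flow.
v, a = appraise(InteractionState(last_answer_correct=False,
                                 consecutive_errors=3, seconds_idle=45.0))
if v < -0.5 and a < 0.5:
    print("agent: send encouraging hint")
```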
- Research Article
88
- 10.1016/j.biopsych.2009.01.026
- Mar 6, 2009
- Biological Psychiatry
Association of Impaired Facial Affect Recognition with Basic Facial and Visual Processing Deficits in Schizophrenia
- Research Article
- 10.1088/1361-6579/ad2eb6
- Mar 1, 2024
- Physiological Measurement
Objective. Extracting discriminative spatial information from multiple electrodes is a crucial and challenging problem for electroencephalogram (EEG)-based emotion recognition. Additionally, the domain shift caused by the individual differences degrades the performance of cross-subject EEG classification. Approach. To deal with the above problems, we propose the cerebral asymmetry representation learning-based deep subdomain adaptation network (CARL-DSAN) to enhance cross-subject EEG-based emotion recognition. Specifically, the CARL module is inspired by the neuroscience findings that asymmetrical activations of the left and right brain hemispheres occur during cognitive and affective processes. In the CARL module, we introduce a novel two-step strategy for extracting discriminative features through intra-hemisphere spatial learning and asymmetry representation learning. Moreover, the transformer encoders within the CARL module can emphasize the contributive electrodes and electrode pairs. Subsequently, the DSAN module, known for its superior performance over global domain adaptation, is adopted to mitigate domain shift and further improve the cross-subject performance by aligning relevant subdomains that share the same class samples. Main Results. To validate the effectiveness of the CARL-DSAN, we conduct subject-independent experiments on the DEAP database, achieving accuracies of 68.67% and 67.11% for arousal and valence classification, respectively, and corresponding accuracies of 67.70% and 67.18% on the MAHNOB-HCI database. Significance. The results demonstrate that CARL-DSAN can achieve an outstanding cross-subject performance in both arousal and valence classification.
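The CARL module itself is transformer-based, but the hemispheric-asymmetry intuition it builds on can be illustrated with the classic differential asymmetry (DASM) feature used widely in EEG emotion work. The sketch below is a generic illustration under assumed toy band powers and a hypothetical subset of symmetric 10-20 electrode pairs, not the paper's pipeline.

```python
import numpy as np

# Illustration of the hemispheric-asymmetry idea: differential asymmetry
# (DASM) contrasts band power at symmetric left/right electrodes.
# The pairs and values below are assumptions for demonstration only.
PAIRS = [("F3", "F4"), ("F7", "F8"), ("C3", "C4"), ("P3", "P4"), ("O1", "O2")]

def dasm(band_power: dict[str, float]) -> np.ndarray:
    """Differential asymmetry: left-hemisphere band power minus right."""
    return np.array([band_power[l] - band_power[r] for l, r in PAIRS])

# Toy example: alpha-band power per channel for one trial.
alpha = {"F3": 1.2, "F4": 0.9, "F7": 1.0, "F8": 1.1,
         "C3": 0.8, "C4": 0.8, "P3": 1.5, "P4": 1.1, "O1": 0.7, "O2": 0.9}
print(dasm(alpha))  # one asymmetry value per symmetric electrode pair
```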
- Research Article
11
- 10.3390/app12094236
- Apr 22, 2022
- Applied Sciences
The recognition of human emotions is expected to completely change the mode of human-computer interaction. In emotion recognition research, we need to focus on accuracy and real-time performance in order to apply emotion recognition based on physiological signals to practical problems. Considering the timeliness dimension of emotion recognition, we propose a terminal-edge-cloud system architecture. Compared to traditional affective computing architectures, the proposed architecture reduces average time consumption by 15% when running the same affective computing process. We also propose a Joint Mutual Information (JMI)-based feature extraction affective computing model and conduct extensive experiments on the AMIGOS dataset. Experimental comparison shows that this feature extraction network has clear advantages over commonly used methods. The model performs emotion classification with average accuracies of 71% for valence and 81.8% for arousal; compared with recent similar emotion classifiers, the average accuracy is improved by 0.85%. In addition, we set up an experiment with 30 people in an online learning scenario to validate the computing system and algorithm model. The results showed that accuracy and real-time recognition were satisfactory, improving the real-time emotional interaction experience in online learning.
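As a rough illustration of what JMI-based feature selection computes (the paper's feature extraction network is not reproduced here), the sketch below greedily selects discretized features by their joint mutual information with the class label. The toy data, binning, and helper names (joint_mi, jmi_select) are assumptions.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

# Greedy JMI selection: score a candidate feature f by the sum, over
# already-selected features s, of I((X_f, X_s); Y). Generic sketch only.

def joint_mi(xf, xs, y):
    # Treat the (xf, xs) pair as one discrete variable by pairing codes.
    paired = xf.astype(np.int64) * (int(xs.max()) + 1) + xs
    return mutual_info_score(paired, y)

def jmi_select(X, y, k):
    """X: (n_samples, n_features) of small non-negative ints; y: labels."""
    n_feat = X.shape[1]
    selected = [max(range(n_feat), key=lambda f: mutual_info_score(X[:, f], y))]
    while len(selected) < k:
        rest = [f for f in range(n_feat) if f not in selected]
        best = max(rest, key=lambda f: sum(joint_mi(X[:, f], X[:, s], y)
                                           for s in selected))
        selected.append(best)
    return selected

# Toy usage with 5 discretized (binned) physiological features.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 5))
y = (X[:, 1] + X[:, 3] > 3).astype(int)  # label depends on features 1 and 3
print(jmi_select(X, y, 2))               # expected to pick features 1 and 3
```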
- Research Article
3
- 10.1016/j.cortex.2022.03.005
- Mar 23, 2022
- Cortex
Changes in functional connectivity associated with facial expression processing over the working adult lifespan
- Research Article
39
- 10.1037/a0033748
- Jan 1, 2013
- Behavioral Neuroscience
The ability to understand the thoughts and feelings of another person is an important prerequisite for successful social interaction. One part of this ability is the recognition of emotions in the face of a counterpart. Knowledge of genetic contributions to emotion recognition is still scarce. In the present study, 105 healthy participants were experimentally tested for their ability to recognize complex emotions in faces. As prior studies outlined the importance of the oxytocin system for emotion recognition, the functional rs2268498 polymorphism of the OXTR gene was investigated. Although there were no differences in reaction times between genotype groups, carriers of the T-allele exhibited more accurate recognition skills than subjects carrying the CC-genotype. There was no influence of gender or age. The results support recent findings demonstrating the importance of the oxytocin system for affect processing and related social behavior.
- Research Article
32
- 10.1016/s0893-6080(99)00096-9
- Mar 1, 2000
- Neural Networks
A recurrent model of transformation invariance by association
- Research Article
- 10.5080/u27600
- Jan 1, 2025
- Türk Psikiyatri Dergisi (Turkish Journal of Psychiatry)
Culture plays a prominent role in recognition and rating of emotions. This study aims to develop a standardized measurement tool specific to Türkiye for assessing affect and recognizing emotions. The tool is designed to be brief and practical for use as a bedside test in clinical settings. Data were collected from 610 university students (psychology majors). The scale consisted of 500 black-and-white photographs taken under standard conditions by a professional photographer, depicting seven emotions (anxiety, fear, anger, joy, surprise, disgust, and sadness). Through four selection/elimination stages, the initial 500 photographs were reduced to 22. Expert opinions were gathered to assess the content validity of the test. Item reliability was assessed using the test-retest method, and the reliability coefficient was calculated using the Gwet AC1 technique. Following these stages, the final 20 photographs formed the Brief Affect and Emotion Recognition Test (BAET). The normative emotion recognition percentages for the 20 items ranged between 42.2% and 95.6%. Normative affect intensity scores ranged from 2.3 to 4.8. The Gwet AC1 reliability coefficient of the BAET was calculated as 73.2. In this study, a culture-specific test was developed to measure affect and emotion recognition processes, and its content validity and reliability were assessed. The findings indicate that the Brief Affect and Emotion Recognition Test (BAET) is a valid and reliable measurement tool, introducing a brief and practical test to the field.
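For readers unfamiliar with the reliability statistic named above, the sketch below computes Gwet's AC1 in its generic two-rater, K-category form; the toy ratings are invented, and the study's exact test-retest computation is not described in the abstract.

```python
import numpy as np

# Gwet's AC1 for two raters and K categories:
#   AC1 = (p_a - p_e) / (1 - p_e),
# where p_a is observed agreement and p_e = sum_k pi_k(1 - pi_k) / (K - 1),
# with pi_k the mean proportion of ratings falling in category k.

def gwet_ac1(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.unique(np.concatenate([r1, r2]))
    k = len(cats)
    p_a = np.mean(r1 == r2)                      # observed agreement
    pi = np.array([(np.mean(r1 == c) + np.mean(r2 == c)) / 2 for c in cats])
    p_e = np.sum(pi * (1 - pi)) / (k - 1)        # Gwet's chance agreement
    return (p_a - p_e) / (1 - p_e)

# Invented test-retest example: the same 10 items rated twice.
t1 = ["joy", "fear", "anger", "joy", "sad", "joy", "fear", "sad", "joy", "anger"]
t2 = ["joy", "fear", "anger", "joy", "joy", "joy", "fear", "sad", "joy", "anger"]
print(round(gwet_ac1(t1, t2), 3))  # ~0.87 for 9/10 agreement here
```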
- Research Article
4
- 10.3389/fnhum.2023.1132254
- Jun 1, 2023
- Frontiers in Human Neuroscience
Introduction: EEG signals can non-invasively monitor brain activity and have been widely used in brain-computer interfaces (BCIs). One research area is recognizing emotions objectively through EEG. People's emotions change over time; however, most existing affective BCIs process data and recognize emotions offline and thus cannot be applied to real-time emotion recognition.
Methods: To solve this problem, we introduce an instance selection strategy into transfer learning and propose a simplified style transfer mapping algorithm. In the proposed method, informative instances are first selected from the source-domain data, and the hyperparameter update strategy of style transfer mapping is simplified, making model training quicker and more accurate for a new subject.
Results: To verify the effectiveness of our algorithm, we carried out experiments on SEED, SEED-IV, and an offline dataset collected by ourselves, achieving recognition accuracies of up to 86.78%, 82.55%, and 77.68% with computing times of 7 s, 4 s, and 10 s, respectively. Furthermore, we developed a real-time emotion recognition system integrating modules for EEG signal acquisition, data processing, emotion recognition, and result visualization.
Discussion: The results of both the offline and online experiments show that the proposed algorithm can accurately recognize emotions in a short time, meeting the needs of real-time emotion recognition applications.
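A minimal sketch of the style-transfer-mapping idea, assuming the standard formulation: an affine map (A, b) pulls a new subject's features toward source-domain class means, with a ridge penalty keeping the map close to identity. This is a generic reconstruction, not the paper's simplified algorithm or its instance selection step.

```python
import numpy as np

# Style transfer mapping (STM), generic form: fit (A, b) minimizing
#   ||X W^T - T||^2 + beta ||W - W0||^2,  with W = [A b], W0 = [I 0],
# where each row of T is the source class mean the sample should land on.

def fit_stm(F, y, class_means, beta=1.0):
    """F: (n, d) target-subject features; y: labels; class_means: label -> (d,)."""
    n, d = F.shape
    T = np.stack([class_means[label] for label in y])  # desired destinations
    X = np.hstack([F, np.ones((n, 1))])                # augment for the bias b
    W0 = np.hstack([np.eye(d), np.zeros((d, 1))])      # regularize toward identity
    W = np.linalg.solve(X.T @ X + beta * np.eye(d + 1),
                        X.T @ T + beta * W0.T).T       # closed-form ridge solution
    return W[:, :d], W[:, d]                           # A, b

def predict(F, A, b, class_means):
    """Nearest-class-mean classification after mapping the features."""
    mapped = F @ A.T + b
    labels = list(class_means)
    M = np.stack([class_means[c] for c in labels])
    dists = ((mapped[:, None, :] - M[None, :, :]) ** 2).sum(-1)
    return [labels[i] for i in dists.argmin(1)]
```

In the paper's online setting, the labels used to fit the map would come from the selected informative source instances or from pseudo-labels; here they are simply assumed to be given.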
- Research Article
3
- 10.3389/fpsyg.2021.644704
- Mar 12, 2021
- Frontiers in Psychology
Facial emotion recognition is something we use often in daily life. How does the brain process the face search, and can taste modify this process? This study employed two tastes (sweet and acidic) to investigate the cross-modal interaction between taste and emotional face recognition. Behavioral responses (reaction times and correct response ratios) and event-related potentials (ERPs) were used to analyze the interaction between taste and face processing. The behavioral data showed that when detecting a negative target face with a positive face as a distractor, participants performed the task faster with an acidic taste than with a sweet one. No interaction effect was observed in the correct response ratios. In the ERP results, the early (P1, N170) and mid-stage (early posterior negativity, EPN) components showed that sweet and acidic tastes modulated the affective face search process. No interaction effect was observed in the late-stage (LPP) component. Our data extend the understanding of the cross-modal mechanism and provide electrophysiological evidence that affective facial processing can be influenced by sweet and acidic tastes.
- Research Article
21
- 10.12740/pp/38919
- Jan 1, 2015
- Psychiatria Polska
Emotion recognition is an important aspect of social interactions. Patients suffering from schizophrenia exhibit disturbances in affective processing. The aim of the study was to evaluate facial emotion perception and its relation to psychotic symptoms in schizophrenia patients. 102 patients with schizophrenia (F20.0, ICD-10) and 50 healthy volunteers participated in the study; all subjects were 18-60 years old. Mental state was assessed with the following diagnostic tools: CGI (Clinical Global Impression Scale), PANSS (Positive and Negative Syndrome Scale), CDSS (Calgary Depression Scale for Schizophrenia), and UKU (Side Effect Rating Scale). Facial emotion recognition ability was assessed with SIE-T (Emotional Intelligence Scale - Faces). The gathered data showed that patients suffering from schizophrenia performed worse on the facial emotion recognition task than healthy subjects. Severity of negative symptoms corresponded with the facial emotion perception impairment. No relation was found between age of schizophrenia onset and level of facial emotion perception impairment, but facial emotion recognition ability worsened with subjects' age, both in healthy subjects and in those suffering from schizophrenia. Severity of schizophrenia corresponded with the facial emotion perception impairment.
- Research Article
6
- 10.1016/j.schres.2021.11.027
- Dec 7, 2021
- Schizophrenia Research
Real-time facial emotion recognition deficits across the psychosis spectrum: A B-SNIP Study