Evaluating human perceptions of android robot facial expressions based on variations in instruction styles

Abstract

Robots that interact with humans are required to express emotions in ways that are appropriate to the context. While most prior research has focused primarily on basic emotions, real-life interactions demand more nuanced expressions. In this study, we extended the expressive capabilities of the android robot Nikola by implementing 63 facial expressions, covering not only complex emotions and physical conditions, but also differences in intensity. At Expo 2025 in Japan, more than 600 participants interacted with Nikola by describing situations in which they wanted the robot to perform facial expressions. The robot inferred emotions using a large language model and performed corresponding facial expressions. Questionnaire responses revealed that participants rated the robot’s behavior as more appropriate and emotionally expressive when their instructions were abstract, compared to when they explicitly included emotions or physical states. This suggests that abstract instructions enhance perceived agency in the robot. We also investigated and discussed how impressions of the robot varied depending on the expressions it performed and the personality traits of participants. This study contributes to the field of human–robot interaction by demonstrating how adaptive facial expressions, in combination with instruction styles, shape human perceptions of social robots.
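
The abstract describes a pipeline in which a visitor's free-form situation description is passed to a large language model, which infers an emotion and intensity that are then mapped to one of Nikola's 63 facial expressions. A minimal Python sketch of that flow follows; the expression labels, the prompt, and the call_llm stub are illustrative assumptions, since the abstract does not specify the model, prompt, or label set.

```python
# Illustrative sketch (not the authors' implementation) of the
# instruction-to-expression pipeline: situation description -> LLM emotion
# inference -> facial expression command. Labels and the LLM stub are assumed.

# Hypothetical subset of the 63 (emotion/physical state, intensity) expressions.
EXPRESSIONS = {
    ("joy", "weak"): "smile_soft",
    ("joy", "strong"): "smile_wide",
    ("embarrassment", "strong"): "gaze_down",
    ("fatigue", "strong"): "eyes_half_closed",
}

def call_llm(prompt: str) -> str:
    """Stub standing in for the unspecified large language model."""
    raise NotImplementedError("Replace with a real LLM client call.")

def infer_expression(situation: str) -> str:
    """Map a visitor's free-form situation description to an expression label."""
    prompt = (
        "A visitor describes a situation to an android robot.\n"
        f"Situation: {situation}\n"
        "Reply with 'emotion,intensity' drawn from the robot's supported set."
    )
    emotion, intensity = (part.strip() for part in call_llm(prompt).split(","))
    return EXPRESSIONS.get((emotion, intensity), "neutral")
```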

Similar Papers
  • Research Article
  • 10.3390/children12070816
Development and Validation of the Children's Emotions Database (CED): Preschoolers' Basic and Complex Facial Expressions.
  • Jun 21, 2025
  • Children (Basel, Switzerland)
  • Nadia Koltcheva + 1 more

Background. Emotions are a crucial part of our human nature. The recognition of emotions is an essential component of our social and emotional skills. Facial expressions serve as a key element in discerning others' emotions. Different databases of images of facial emotion expressions exist worldwide; however, most of them are limited to only adult faces and include only the six basic emotions, as well as neutral faces, ignoring more complex emotional expressions. Here, we present the Children's Emotions Database (CED), a novel repository featuring both basic and complex facial expressions captured from preschool-aged children. The CED is one of the first databases to include complex emotional expressions in preschoolers. Our aim was to develop such a database that can be used further for research and applied purposes. Methods. Three 6-year-old children (one female) were photographed while showing different facial emotional expressions. The photos were taken under standardized conditions. The children were instructed to express each of the following basic emotions: happiness, pleasant surprise, sadness, fear, anger, disgust; a neutral face; and four complex emotions: pride, guilt, compassion, and shame; this resulted in a total of eleven expressions for each child. Two photos per child were reviewed and selected for validation. The photo validation was performed with a sample of 104 adult raters (94 females; aged 19-70 years; M = 29.9; SD = 11.40) and a limited sample of 32 children at preschool age (17 girls; aged 4-7 years; M = 6.5; SD = 0.81). The validation consisted of two tasks: free emotion labeling and emotion recognition (with predefined labels). Recognition accuracy for each expression was calculated. Results and Conclusions. While basic emotions and neutral expressions were recognized with high accuracy, complex emotions were less accurately identified, consistent with the existing literature on the developmental challenges in recognizing such emotions. The current work is a promising new database of preschoolers' facial expressions consisting of both basic and complex emotions. This database offers a valuable resource for advancing research in emotional development, educational interventions, and clinical applications tailored to early childhood.
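
The validation step reduces to a per-expression recognition accuracy over the raters' forced-choice answers. A minimal sketch of that computation, assuming a simple (intended, chosen) response format and toy data:

```python
# Minimal sketch of the per-expression recognition-accuracy computation;
# the (intended, chosen) pair format and the example data are assumptions.
from collections import defaultdict

def recognition_accuracy(responses):
    """responses: iterable of (intended_expression, chosen_label) pairs."""
    correct, total = defaultdict(int), defaultdict(int)
    for intended, chosen in responses:
        total[intended] += 1
        correct[intended] += int(intended == chosen)
    return {expr: correct[expr] / total[expr] for expr in total}

# Toy example: happiness recognized 3/4 times, shame only 1/4.
print(recognition_accuracy([
    ("happiness", "happiness"), ("happiness", "happiness"),
    ("happiness", "surprise"), ("happiness", "happiness"),
    ("shame", "guilt"), ("shame", "shame"),
    ("shame", "sadness"), ("shame", "guilt"),
]))
```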

  • Research Article
  • Cited by 1
  • 10.1016/j.biopsycho.2025.109072
The influence of facial expression absence on the recognition of different emotions: Evidence from behavioral and event-related potentials studies.
  • Jul 1, 2025
  • Biological psychology
  • Juan Song + 3 more

  • Research Article
  • Cited by 3
  • 10.1080/13854046.2017.1418024
Dissociation between facial and bodily expressions in emotion recognition: A case study
  • Dec 21, 2017
  • The Clinical Neuropsychologist
  • Samanta Leiva + 3 more

Objective: Existing single-case studies have reported deficits in recognizing basic emotions through facial expression and unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. Methods: In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite’s operational criteria, and we compared the patient’s performance with the control group’s using a modified one-tailed t-test designed specifically for single-case studies. Results: There were no statistically significant differences between the patient’s and the control group’s performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), statistically significant differences between the patient and the control group were observed only for the recognition of body expressions. There were no significant differences between the patient’s and the control group’s correct answers for emotional facial stimuli. Conclusions: Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study to describe this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
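
The "modified one-tailed t-test designed specifically for single-case studies" is commonly computed in the Crawford-Howell form, which treats the control sample mean and SD as estimates rather than population parameters. A sketch under that assumption (the scores below are invented for illustration):

```python
# Sketch of a single-case comparison in the Crawford-Howell form (an assumption
# about which "modified t-test" the abstract refers to); scores are invented.
from statistics import mean, stdev
from scipy import stats

def single_case_t(case_score, control_scores):
    n = len(control_scores)
    m, s = mean(control_scores), stdev(control_scores)
    t = (case_score - m) / (s * ((n + 1) / n) ** 0.5)  # inflates SE for finite n
    df = n - 1
    p_one_tailed = stats.t.sf(abs(t), df)  # one-tailed p for the deficit direction
    return t, df, p_one_tailed

# Hypothetical example: patient scores 12 on a body-expression task,
# 30 controls cluster around 20.
controls = [20, 21, 19, 22, 18, 20, 23, 19, 21, 20] * 3
print(single_case_t(12, controls))
```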

  • Research Article
  • Cited by 7
  • 10.1038/srep11795
Exaggerated perception of facial expressions is increased in individuals with schizotypal traits
  • Jul 2, 2015
  • Scientific Reports
  • Shota Uono + 2 more

Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions.

  • Dissertation
  • 10.13097/archive-ouverte/unige:9620
The perception of emotional facial expressions in normals and in schizophrenic patients
  • Jan 1, 2010
  • P Casati

The research compares the perception of facial expressions (FE) of emotions in controls and schizophrenic patients. Three studies are presented. The first and second studies use a new morphing technique to test whether Categorical Perception (CP) exists for static displays of FE. The joint analysis of an identification task (first study) and a discrimination task using morphing transitions between Neutral-Happiness and Neutral-Anger expressions demonstrates the presence of CP only for Anger in controls. CP does not occur in first-episode schizophrenia patients. The third study tests the identification of the six basic emotions under dynamic conditions. The results of a recognition task of FE from video clips showing the transformation from a neutral face to an emotional face demonstrate that schizophrenia patients need more time and more intense facial activity than controls to detect basic emotions. They also show selective impairments in the identification of Fear and Disgust.

  • Research Article
  • Cited by 8
  • 10.1016/j.actpsy.2019.102941
The asynchronous influence of facial expressions on bodily expressions
  • Sep 1, 2019
  • Acta Psychologica
  • Mingming Zhang + 5 more

  • Research Article
  • Cited by 16
  • 10.2478/v10134-010-0033-8
Perception of dynamic facial emotional expressions in adolescents with autism spectrum disorders (ASD)
  • Jan 1, 2010
  • Translational Neuroscience
  • Roy Kessels + 2 more

Previous studies have shown deficits in the perception of static emotional facial expressions in individuals with autism spectrum disorders (ASD), but results are inconclusive. Possibly, using dynamic facial stimuli expressing emotions at different levels of intensity may produce more robust results, since these resemble the expression of emotions in daily life to a greater extent. Thirty young adolescents with high-functioning ASD (IQ>85) and 30 age- and intelligence-matched controls (ages between 12 and 15) performed the Emotion Recognition Task (ERT), in which morphs were presented on a computer screen, depicting facial expressions of the six basic emotions (happiness, disgust, fear, anger, surprise and sadness) at nine levels of emotional intensity (20–100%). The results showed no overall group difference on the ERT, apart from slightly worse performance on the perception of the emotions fear (p<0.03) and disgust (p<0.05). No interaction was found between intensity level of the emotions and group. High-functioning individuals with ASD perform similarly to matched controls on the perception of dynamic facial emotional expressions, even at low intensities of emotional expression. These findings are in agreement with other recent studies showing that emotion perception deficits in high-functioning ASD may be less pronounced than previously thought.

  • Supplementary Content
  • Cited by 329
  • 10.3389/fpsyg.2012.00471
Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing
  • Nov 2, 2012
  • Frontiers in Psychology
  • Matthias J Wieser + 1 more

Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  • Research Article
  • Cited by 11
  • 10.1002/cav.1539
Asymmetric facial expressions: revealing richer emotions for embodied conversational agents
  • Jul 19, 2013
  • Computer Animation and Virtual Worlds
  • Junghyun Ahn + 3 more

In this paper, we propose a method to achieve effective facial emotional expressivity for embodied conversational agents by considering two types of asymmetry when exploiting the valence–arousal–dominance representation of emotions. Indeed, the asymmetry of facial expressions helps to convey complex emotional feelings such as conflicting and/or hidden emotions due to social conventions. To achieve such a higher degree of facial expression in a generic way, we propose a new model for mapping the valence–arousal–dominance emotion model onto a set of 12 scalar facial part actions built mostly by combining pairs of antagonist action units from the Facial Action Coding System. The proposed linear model can automatically drive a large number of autonomous virtual humans or support the interactive design of complex facial expressions over time. By design, our approach produces symmetric facial expressions, as expected for most of the emotional spectrum. However, more complex ambivalent feelings can be produced when differing emotions are applied on the left and right sides of the face. We conducted an experiment on static images produced by our approach to compare the expressive power of symmetric and asymmetric facial expressions for a set of eight basic and complex emotions. Results confirm both the pertinence of our general mapping for expressing basic emotions and the significant improvement brought by asymmetry for expressing ambivalent feelings.
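
A rough Python sketch of the kind of linear valence-arousal-dominance (VAD) to facial-part-action mapping described above; the 12 action names and the weight matrix are invented placeholders, not the authors' published model, and asymmetry is illustrated by driving the two face halves from different VAD points.

```python
# Illustrative linear VAD -> facial-part-action mapping; the 12 action names
# and the weight matrix are placeholders, not the published coefficients.
import numpy as np

ACTIONS = ["brow_raise", "brow_frown", "upper_lid_open", "lid_close",
           "lip_corner_up", "lip_corner_down", "lip_press", "jaw_drop",
           "nose_wrinkle", "cheek_raise", "head_pitch", "head_yaw"]

# One row of weights per facial part action, columns = (valence, arousal, dominance).
W = np.random.default_rng(0).uniform(-1.0, 1.0, size=(len(ACTIONS), 3))

def vad_to_actions(valence: float, arousal: float, dominance: float) -> dict:
    """Return per-action activations in [-1, 1] for one half of the face."""
    activations = np.clip(W @ np.array([valence, arousal, dominance]), -1.0, 1.0)
    return dict(zip(ACTIONS, activations))

# Asymmetry: drive the two face halves from different VAD points to express
# an ambivalent feeling, e.g. polite amusement masking mild irritation.
left_half = vad_to_actions(0.8, 0.4, 0.3)    # happiness-like VAD point
right_half = vad_to_actions(-0.6, 0.5, 0.4)  # anger-like VAD point
```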

  • Research Article
  • Cited by 40
  • 10.3389/fneur.2016.00230
Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson’s Disease
  • Dec 14, 2016
  • Frontiers in Neurology
  • Matteo Bologna + 6 more

Altered emotional processing, including reduced emotion facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. This study aimed to investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the Facial Action Coding System. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls (all Ps < 0.05). Patients also yielded a worse Ekman global score and worse disgust, sadness, and fear sub-scores than healthy controls (all Ps < 0.001). Altered facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results of this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  • Research Article
  • Cited by 218
  • 10.1016/s0925-4927(98)00036-5
Investigation of facial recognition memory and happy and sad facial expression perception: an fMRI study
  • Sep 1, 1998
  • Psychiatry Research: Neuroimaging
  • Mary L Phillips + 9 more

  • Research Article
  • Cited by 134
  • 10.1186/s13229-016-0113-9
Basic and complex emotion recognition in children with autism: cross-cultural findings.
  • Dec 1, 2016
  • Molecular Autism
  • Shimrit Fridenson-Hayo + 7 more

Background: Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Methods: Fifty-five children with high-functioning ASC, aged 5–9, were compared to 58 TD children. On each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Results: Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize, compared to basic emotions, for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Conclusions: Our findings highlight the multimodal nature of emotion recognition (ER) deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.

  • Research Article
  • Cited by 31
  • 10.2466/pms.1999.89.3.763
College students' perception of facial expressions.
  • Dec 1, 1999
  • Perceptual and Motor Skills
  • Cathy W Hall + 2 more

Ninety-five college students were administered the Facial Expressions subtest of the Diagnostic Analysis of Nonverbal Accuracy to measure perception of nonverbal cues. Participants also completed the Nowicki-Strickland Locus of Control Scale and responded to a short questionnaire regarding their beliefs about their own ability to perceive nonverbal cues as well as how effective they felt others were in perceiving nonverbal cues. A significant correlation between locus of control and perception of adult facial expressions indicated that students with a more internal locus of control had higher scores on correct perception of adult facial expressions. There was no significant correlation between locus of control and facial expressions of children. Sex differences were also found in perception of nonverbal cues: female students scored higher than male students in correctly perceiving facial expressions. Participants also scored higher in correctly perceiving facial expressions of children than of adults.

  • Research Article
  • Cited by 11
  • 10.1162/jocn_a_01445
Out of Focus: Facial Feedback Manipulation Modulates Automatic Processing of Unattended Emotional Faces.
  • Jul 5, 2019
  • Journal of Cognitive Neuroscience
  • Maria Kuehne + 4 more

Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and the resulting facial feedback play an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, the influence of facial feedback on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended facial expressions can be investigated with the visual expression-related mismatch negativity (MMN). The expression-related MMN reflects a differential ERP of automatic detection of emotional changes elicited by rarely presented facial expressions (deviants) among frequently presented facial expressions (standards). In this study, we investigated the impact of facial feedback on the automatic processing of facial expressions. For this purpose, participants (n = 19) performed a centrally presented visual detection task while neutral (standard), happy, and sad faces (deviants) were presented peripherally. During the task, facial feedback was manipulated by different pen-holding conditions (holding the pen with the teeth, lips, or nondominant hand). Our results indicate that the automatic processing of facial expressions is influenced by, and thus dependent on, the observer's own facial feedback.

  • Research Article
  • Cited by 42
  • 10.3389/frobt.2023.1271610
Real-time emotion generation in human-robot dialogue using large language models.
  • Dec 1, 2023
  • Frontiers in Robotics and AI
  • Chinmaya Mishra + 3 more

Affective behaviors enable social robots to not only establish better connections with humans but also serve as a tool for the robots to express their internal states. It has been well established that emotions are important for signaling understanding in Human-Robot Interaction (HRI). This work aims to harness the power of Large Language Models (LLMs) and proposes an approach to control the affective behavior of robots. By interpreting emotion appraisal as an Emotion Recognition in Conversation (ERC) task, we used GPT-3.5 to predict the emotion of a robot's turn in real time, using the dialogue history of the ongoing conversation. The robot signaled the predicted emotion using facial expressions. The model was evaluated in a within-subjects user study (N = 47) in which the model-driven emotion generation was compared against conditions where the robot did not display any emotions and where it displayed incongruent emotions. The participants interacted with the robot by playing a card sorting game that was specifically designed to evoke emotions. The results indicated that the emotions were reliably generated by the LLM and that participants were able to perceive the robot's emotions. The robot expressing congruent, model-driven facial emotion expressions was perceived as significantly more human-like and emotionally appropriate, and elicited a more positive impression. Participants also scored significantly better in the card sorting game when the robot displayed congruent facial expressions. From a technical perspective, the study shows that LLMs can be used to control the affective behavior of robots reliably in real time. Additionally, our results could be used in devising novel human-robot interactions, making robots more effective in roles where emotional interaction is important, such as therapy, companionship, or customer service.
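
A minimal sketch of the ERC-style appraisal described above, in which the dialogue history is sent to GPT-3.5 and the returned label drives a facial expression; the prompt wording and emotion label set are assumptions not taken from the paper.

```python
# Sketch of the ERC-style appraisal: predict the robot's next-turn emotion from
# dialogue history with GPT-3.5. Prompt wording and label set are assumptions;
# assumes OPENAI_API_KEY is set and the openai (>=1.0) client library.
from openai import OpenAI

EMOTIONS = ["joy", "sadness", "anger", "surprise", "fear", "neutral"]
client = OpenAI()

def appraise_robot_emotion(dialogue_history: list[str]) -> str:
    """Return an emotion label for the robot's upcoming turn."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": ("Choose the single most fitting emotion for the robot's "
                         f"next turn from: {', '.join(EMOTIONS)}. "
                         "Answer with one word.")},
            {"role": "user", "content": "\n".join(dialogue_history)},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in EMOTIONS else "neutral"

# The returned label would then index a facial-expression command on the robot.
```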
