Abstract

Recognizing emotions from faces is something we do constantly in daily life. How does the brain carry out this face search, and can taste modify the process? This study employed two tastes (sweet and acidic) to investigate the cross-modal interaction between taste and emotional face recognition. Behavioral responses (reaction times and correct-response ratios) and event-related potentials (ERPs) were analyzed to characterize the interaction between taste and face processing. Behaviorally, when detecting a negative target face with a positive face as a distractor, participants performed the task faster with an acidic taste than with a sweet one. No interaction effect was observed in the correct-response ratios. In the ERP results, the early (P1, N170) and mid-stage [early posterior negativity (EPN)] components showed that sweet and acidic tastes modulated the affective face search process, whereas no interaction effect was observed in the late-stage (LPP) component. Our data extend the understanding of cross-modal mechanisms and provide electrophysiological evidence that affective facial processing can be influenced by sweet and acidic tastes.

Highlights

  • Facial expressions play a significant role in social situations, and they become even more intriguing during fine dining

  • Our results have shown that gustatory stimuli can influence affective facial processing at behavioral and neural levels

  • Temporal-dynamic event-related potential (ERP) analysis revealed significant interaction effects between emotional face and taste on the P1, N170, and early posterior negativity (EPN) components



Introduction

Facial expressions play a significant role in social situations, and they become even more intriguing during fine dining. So much so that one of the critical social skills in high-table culture is detecting others’ emotions while dining. How does taste influence the perception of facial expressions by affecting emotion? Emotional face detection is a well-established paradigm for studying cross-modal sensory integration, and emotional facial processing has been studied extensively (Schindler and Bublatzky, 2020). Many event-related potential (ERP) studies have shown that distinct ERP components index the early (P1, N170), mid-latency [early posterior negativity (EPN)], and late [late positive potential (LPP)] stages of emotional facial processing (Xia et al., 2014; Schindler and Bublatzky, 2020). The P1 component represents the early electrocortical processing of facial information, and the emotional modulation has been inconsistent with different


