Abstract

Cognitive control is difficult to assess directly and is readily influenced by emotion. Understanding an individual's cognitive control level is crucial for enhancing VR interaction and for designing adaptive, self-correcting VR/AR applications. Emotions can reallocate processing resources and thereby affect cognitive control performance. However, prior research has focused primarily on the effect of emotional valence on cognitive control tasks, largely neglecting emotional arousal. In this study, we comprehensively investigate the influence of emotion on cognitive control based on the arousal-valence model. Twenty-six participants were recruited; emotions were induced through VR videos with high ecological validity, after which participants performed related cognitive control tasks. Leveraging physiological data including EEG, HRV, and EDA, we employ classification techniques such as SVM, KNN, and deep learning to categorize cognitive control levels. The experimental results demonstrate that high-arousal emotions significantly enhance users' cognitive control abilities. Exploiting complementary information among multi-modal physiological signal features, we achieve an accuracy of 84.52% in distinguishing between high and low cognitive control. Additionally, time-frequency analysis confirms the existence of neural patterns related to cognitive control, contributing to a better understanding of the neural mechanisms underlying cognitive control in VR. Our findings indicate that physiological signals measured from both the central and autonomic nervous systems can be used for cognitive control classification, paving the way for novel approaches to improving VR/AR interaction.
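The multi-modal classification described above can be sketched as feature-level fusion followed by an SVM. The block below is a minimal illustration, assuming per-trial feature vectors for each modality; the feature dimensions, synthetic data, and hyperparameters are placeholders, not the authors' actual pipeline.

```python
# Sketch of multi-modal cognitive control classification: EEG, HRV, and EDA
# feature vectors are concatenated (feature-level fusion) and fed to an SVM
# separating high vs. low cognitive control. All data here is synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 120

# Hypothetical per-trial feature blocks (dimensions are illustrative).
eeg_feats = rng.normal(size=(n_trials, 32))  # e.g. band powers per channel
hrv_feats = rng.normal(size=(n_trials, 6))   # e.g. RMSSD, SDNN, LF/HF ratio
eda_feats = rng.normal(size=(n_trials, 4))   # e.g. SCR count, tonic level

# Binary labels: 1 = high cognitive control, 0 = low.
y = rng.integers(0, 2, size=n_trials)

# Feature-level fusion: concatenate modalities into one vector per trial.
X = np.hstack([eeg_feats, hrv_feats, eda_feats])

# Standardize features, classify with an RBF-kernel SVM, evaluate by 5-fold CV.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

A KNN or deep model, as also mentioned in the abstract, would slot into the same pipeline in place of the SVC step.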
