Abstract

There is little research on how Virtual Reality (VR) applications can identify and respond meaningfully to users' emotional changes. In this paper, we investigate the impact of Context-Aware Empathic VR (CAEVR) on the emotional and cognitive aspects of user experience in VR. We developed a real-time emotion prediction pipeline using electroencephalography (EEG), electrodermal activity (EDA), and heart rate variability (HRV), evaluating both personalized and generalized models for emotion recognition. We then explored the application of this pipeline in a context-aware empathic (CAE) virtual agent and an emotion-adaptive (EA) VR environment. We found a significant increase in positive emotions, cognitive load, and empathy toward the CAE agent, suggesting the potential of CAEVR environments to refine user-agent interactions. We identify lessons learned from this study and directions for future work.
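As a rough illustration only (the paper's actual features, model, and labels are not given in this abstract), the sketch below shows one plausible way a real-time emotion recognition pipeline could combine EEG, EDA, and HRV features in a classifier; the feature names, window sizes, and the choice of a random-forest model are assumptions, not the authors' method.

```python
# Hypothetical sketch of an emotion recognition pipeline from physiological features.
# Assumed (not from the paper): per-window features
# [EEG alpha power, EEG beta power, EDA tonic level, HRV RMSSD, mean heart rate]
# and binary emotion labels (e.g., low vs. high valence).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder data: rows are time windows, columns are the assumed features above.
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)  # placeholder emotion labels

# A "personalized" model would be fit on one user's data; a "generalized" model
# on data pooled across users. The pipeline itself is unchanged either way.
model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=100, random_state=0))
model.fit(X, y)

# Real-time use: compute the same features for the latest window and predict.
latest_window = rng.normal(size=(1, 5))
print(model.predict(latest_window))
```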
