Abstract

The following case study was carried out on a sample of one experimental and one control group. Participants in the experimental group watched a movie section from the standardized LATEMO-E database in virtual reality (VR) on Oculus Rift S and HTC Vive Pro devices. In the control group, the movie section was displayed on an LCD monitor. The movie section was categorized according to Ekman's and Russell's classification models of evoked emotional state. The range of valence and arousal was determined in both groups, measured in each using the Self-Assessment Manikin (SAM). The control group was additionally captured by a camera and evaluated with Affectiva's Affdex software in order to compare valence values. The control group showed a very high correlation (0.92) between the SAM and Affdex results. Taking the Affdex results as a reference value, it can be concluded that participants evaluated their emotions via SAM objectively. The results from both groups confirm that the movie section evokes the intended negative emotion, and that negative emotion was perceived more intensely than its positive counterpart. Using virtual reality to evoke a negative emotion (anger) confirmed that VR elicits significantly more intense emotion than an LCD monitor.
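The 0.92 agreement between SAM self-reports and Affdex valence scores is a Pearson correlation. A minimal sketch of that computation, assuming hypothetical per-participant valence ratings (the study's raw data are not reproduced here):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two rating series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# Hypothetical valence ratings (illustrative only, not the study's data):
sam_valence    = [2.0, 1.5, 3.0, 2.5, 1.0, 2.0]   # SAM self-reports (1-9 scale)
affdex_valence = [-30, -45, -10, -20, -55, -35]   # Affdex valence estimates

r = pearson_r(sam_valence, affdex_valence)
```

The two instruments use different scales, but Pearson correlation is invariant to linear rescaling, so a high `r` indicates that the self-reports and the automatic estimates order the participants consistently.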

Highlights

  • Psychology, as well as other scientific disciplines, has placed the emotional state in its focus of inquiry

  • H2: There is a statistically significant difference between the intensity of emotion evoked by virtual reality and the monitor

  • In "A Case Study of Facial Emotion Classification Using Affdex" (Magdin et al 2019a) we pointed out the shortcomings of Affdex


Introduction

Psychology, as well as other scientific disciplines, has placed the emotional state in its focus of inquiry. The emotional state can play a significant role in areas such as education, driving a motor vehicle, and health care, but also in smart homes. Current automatic recognition systems are based on three basic phases: user detection, extraction of areas of interest, and subsequent classification of the emotional state. The following methods can be applied to classify the emotional state with the help of an automatic recognition system (Magdin et al 2019a; Marín-Morales et al 2018): 1. capturing the face and areas of interest using a camera; 2. placing sensors on different parts of the body (e.g. Galvanic Skin Response (GSR), Heart Rate (HR), Temperature, Electroencephalography (EEG)) and processing the respective data; 3.
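The three phases described above (user detection, region-of-interest extraction, classification) can be sketched as a simple pipeline. This is a minimal stub for illustration only: the detector, ROI extractor, and classifier are hypothetical stand-ins, not the algorithms used by Affdex or the cited systems.

```python
from dataclasses import dataclass
from typing import List

Frame = List[List[int]]  # grayscale frame as a 2-D intensity grid

@dataclass
class BoundingBox:
    x: int
    y: int
    w: int
    h: int

def detect_user(frame: Frame) -> BoundingBox:
    """Phase 1: locate the user in the frame (stub: whole frame)."""
    return BoundingBox(0, 0, len(frame[0]), len(frame))

def extract_roi(frame: Frame, box: BoundingBox) -> Frame:
    """Phase 2: crop the area of interest from the frame."""
    return [row[box.x:box.x + box.w] for row in frame[box.y:box.y + box.h]]

def classify_emotion(roi: Frame) -> str:
    """Phase 3: map ROI features to an emotion label (stub: mean intensity)."""
    pixels = [p for row in roi for p in row]
    return "negative" if sum(pixels) / len(pixels) < 128 else "positive"

def recognize(frame: Frame) -> str:
    """Run the full detection -> extraction -> classification pipeline."""
    return classify_emotion(extract_roi(frame, detect_user(frame)))
```

In a real system each stub would be replaced by a trained component (e.g. a face detector for phase 1 and a facial-expression classifier for phase 3); the point here is only the staged structure shared by such systems.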

