Abstract
Emotional states in humans arise from stimulation of the basic senses, of which vision is the most fundamental: visual stimuli are perceived and elicit emotional states. Augmented Reality (AR) applications likewise operate through visual stimuli. This study investigates, with the support of Electroencephalography (EEG), how effectively AR systems, as one class of immersive environments, distinguish students' emotional states during book-reading activities. The BOOKAR dataset collected within the scope of this study is among the first AR-supported datasets for emotion recognition from physiological signals using immersive methods. To elicit the readers' emotional states, texts stimulating emotions such as disgust, happiness, and neutrality are presented, with 2-dimensional pictures embedded in them. In the AR-based reading condition, 3-dimensional models of these 2-dimensional pictures, as well as rigged versions of those models, are presented to the reader as stimuli. The results show that AR-based reading has a significant discriminatory effect, especially on readers' valence-arousal emotional states, and achieves higher classification performance than real (non-AR) reading. The results also show that the proposed method classifies emotional states from EEG signals well, with accuracy scores close to 100%. The designed AR application was also observed to satisfy usability criteria. The proposed emotion recognition method for AR applications has significant potential for integration into various Metaverse-based applications.