The field of VR-EEG affective computing is advancing rapidly, yet it faces challenges such as the lack of a solid psychological theoretical foundation, limited classification accuracy, and high computational cost. This study established a standardized VR video library to elicit emotions: participants viewed positive, negative, and neutral VR videos while their EEG data were recorded. Grounded in Affective Style Theory, the study proposes an emotion valence recognition strategy for VR that balances computational efficiency and classification accuracy through multidimensional complementary feature extraction from EEG signals, feature selection or dimensionality reduction coupled with classifiers, and optimal frequency band selection. The findings indicate that multidimensional complementary feature extraction in the frequency and spatial domains enhances recognition performance; notably, theta-band features are pivotal for emotion valence recognition in VR environments. Strategies such as PCA-RF and RBFNN outperform existing methods, achieving an average classification accuracy of up to 95.6% while remaining computationally efficient. Theoretically, the study deepens our understanding of emotional perception consistency and variability under Affective Style Theory, offering insights into recognizing individual emotional states. Practically, its emphasis on the efficiency-accuracy balance makes it feasible to integrate VR-EEG affective computing technology into a broader range of applications.
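To make the PCA-RF strategy named above concrete, the following is a minimal sketch of dimensionality reduction coupled with a Random Forest classifier, assuming band-power features (e.g., theta-band) have already been extracted from preprocessed EEG epochs. The feature matrix `X`, valence labels `y`, and all parameter values are illustrative placeholders, not the paper's actual data or settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Placeholder data: 300 epochs x 128 EEG band-power features,
# with three valence classes (positive / neutral / negative).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 128))
y = rng.integers(0, 3, size=300)

# PCA retains enough components to explain 95% of the variance,
# reducing dimensionality before the Random Forest classifier.
clf = make_pipeline(
    PCA(n_components=0.95),
    RandomForestClassifier(n_estimators=200, random_state=0),
)

# 5-fold cross-validated classification accuracy.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.3f}")
```

The pipeline structure reflects the abstract's stated design of pairing dimensionality reduction with a classifier; the reported 95.6% accuracy comes from the study's own features and data, not from this toy example.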