Abstract
Emotion electroencephalography (EEG) datasets play a significant role in EEG-based emotion recognition research, providing a common platform for comparing different emotion recognition methods. Most existing datasets use 2D images or videos as mood induction procedures (MIPs); however, given the differences in EEG dynamics between 2D and 3D environments, findings from experiments based on 2D MIPs may transfer poorly to the real 3D world. In this paper, we (1) developed a new emotion EEG dataset, the virtual reality emotional EEG dataset (VREED), which uses 3D VR videos as MIPs; and (2) presented a baseline for negative/positive emotion classification on the new dataset. The best average accuracy of 73.77% ± 2.01% was obtained using the combination of theta (4–8 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (30–49 Hz) relative power features. Additionally, Spearman correlation analysis and feature selection showed that the occipital and frontal regions played a more critical role in emotion processing than other regions. This new VR emotion EEG dataset will be publicly available, and we encourage other researchers to evaluate their emotion classification methods on VREED.
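The baseline features above are relative band powers in the theta, alpha, beta, and gamma ranges. As an illustration only (not the authors' exact pipeline), a relative-power feature for one EEG channel can be sketched with a Welch power spectral density estimate; the sampling rate, window length, and band edges here are assumptions:

```python
import numpy as np
from scipy.signal import welch

def relative_band_powers(signal, fs, bands=None):
    """Relative power per band: band power divided by total 4-49 Hz power.

    signal : 1-D array, one EEG channel
    fs     : sampling rate in Hz (assumed value in the example below)
    """
    if bands is None:
        # Band definitions taken from the abstract (Hz)
        bands = {"theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 49)}
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    total_mask = (freqs >= 4) & (freqs <= 49)
    total = np.trapz(psd[total_mask], freqs[total_mask])
    rel = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs <= hi)
        rel[name] = np.trapz(psd[mask], freqs[mask]) / total
    return rel

# Hypothetical usage: 10 s synthetic signal at 250 Hz dominated by 10 Hz alpha
np.random.seed(0)
fs = 250
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
powers = relative_band_powers(sig, fs)  # alpha should dominate
```

In a full pipeline, such per-channel, per-band features would be concatenated across electrodes and fed to a classifier.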