Abstract
Brain–computer interface (BCI) systems can decode affective brain activity into interpretable features and facilitate emotional human–computer interaction. However, individual differences in the neurophysiological responses of BCI subjects remain a stumbling block for individual-independent emotion recognition. In this study, we propose a new locally-robust feature selection (LRFS) method to determine generalizable electroencephalography (EEG) features within several subsets of accessible subjects. In the LRFS framework, the extracted EEG features are first modeled with probability densities. The inter-individual consistency of each EEG feature is then characterized by evaluating the similarity of its density functions between every pair of subjects. The derived consistency determines the locally-robust EEG features, and the importance of each feature is further examined according to the margin loss between emotions. To fuse the features selected from multiple subsets of subjects, we employ the ensemble learning principle and build an emotion classifier committee. On the public DEAP and MAHNOB-HCI databases, the LRFS-based classifier achieves individual-independent classification accuracies of 0.65–0.68 (DEAP) and 0.67–0.70 (MAHNOB-HCI) in the arousal and valence dimensions, respectively. The competitiveness of LRFS is validated through comparisons with several existing feature selection methods and emotion recognition systems.
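The abstract only outlines the LRFS pipeline; the specific density model, similarity measure, margin-loss criterion, and fusion rule are not given here. The sketch below is therefore an illustration of the general idea under explicit assumptions, not the authors' implementation: Gaussian kernel density estimates for the per-subject feature densities, a Bhattacharyya-style overlap as the pairwise similarity, SVM base classifiers, and majority voting for the committee. The helper names (`pairwise_consistency`, `select_robust_features`, `build_committee`, `committee_predict`) are hypothetical, and the margin-loss ranking mentioned in the abstract is not reproduced.

```python
# Illustrative sketch of a consistency-based feature selection plus classifier
# committee, loosely following the pipeline described in the abstract.
# Assumptions (not specified in the source): KDE densities, Bhattacharyya-style
# overlap as similarity, SVM base learners, majority voting.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.svm import SVC


def pairwise_consistency(feature_values_per_subject, grid_size=200):
    """Score one EEG feature by its average density overlap across subject pairs."""
    all_vals = np.concatenate(feature_values_per_subject)
    grid = np.linspace(all_vals.min(), all_vals.max(), grid_size)
    step = grid[1] - grid[0]
    densities = [gaussian_kde(v)(grid) for v in feature_values_per_subject]
    overlaps = []
    for i in range(len(densities)):
        for j in range(i + 1, len(densities)):
            # Bhattacharyya coefficient approximated on the shared grid
            overlaps.append(np.sum(np.sqrt(densities[i] * densities[j])) * step)
    return float(np.mean(overlaps))


def select_robust_features(X_per_subject, top_k):
    """X_per_subject: list of (n_trials, n_features) arrays, one array per subject."""
    n_features = X_per_subject[0].shape[1]
    scores = [pairwise_consistency([X[:, f] for X in X_per_subject])
              for f in range(n_features)]
    # Keep the features whose densities agree most across subjects
    return np.argsort(scores)[::-1][:top_k]


def build_committee(subject_subsets, labels_per_subset, top_k=10):
    """Train one classifier per subset of subjects on its locally-robust features."""
    classifiers, feature_subsets = [], []
    for subset_X, subset_y in zip(subject_subsets, labels_per_subset):
        feats = select_robust_features(subset_X, top_k)
        X = np.vstack([X_s[:, feats] for X_s in subset_X])
        y = np.concatenate(subset_y)
        classifiers.append(SVC(kernel="rbf").fit(X, y))
        feature_subsets.append(feats)
    return classifiers, feature_subsets


def committee_predict(classifiers, feature_subsets, X_test):
    """Majority vote over the committee (assumes binary 0/1 emotion labels)."""
    votes = np.stack([clf.predict(X_test[:, feats])
                      for clf, feats in zip(classifiers, feature_subsets)])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

In this reading, each subset of accessible subjects yields its own locally-robust feature set and base classifier, and the committee fuses their decisions for an unseen subject; the choice of overlap measure and base learner here is purely illustrative.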