Abstract

Emotional state recognition is an important part of emotion research. Compared with non-physiological signals, electroencephalogram (EEG) signals reflect a person’s emotional state truthfully and objectively. To exploit the emotional information carried by multiple frequency bands and to address the noise in EEG signals, this paper proposes robust multi-frequency band joint dictionary learning with low-rank representation (RMBDLL). Building on dictionary learning, sparse and low-rank representation are jointly integrated to reveal the intrinsic connections among, and the discriminative information within, the EEG frequency bands. RMBDLL consists of two parts: robust dictionary learning and intra-class/inter-class local constraint learning. In the robust dictionary learning part, RMBDLL separates the complex noise in EEG signals and learns a clean sub-dictionary for each frequency band, improving the robustness of the model; data from different frequency bands share the same coding coefficients, reflecting the consistency of the underlying emotional state. In the intra-class/inter-class local constraint learning part, RMBDLL introduces a regularization term composed of intra-class and inter-class local constraints constructed from the local structural information of the dictionary atoms, which promotes intra-class similarity and inter-class separability across the EEG frequency bands. The effectiveness of RMBDLL is verified on the SEED dataset under different types of noise. The experimental results show that RMBDLL preserves the discriminative local structure of the training samples and achieves good recognition performance on noisy EEG emotion datasets.
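
To make the joint structure concrete, the following is a minimal sketch of a simplified multi-band dictionary learning objective in the spirit of the abstract: each frequency band is decomposed into a per-band dictionary times a coding matrix shared across bands, plus a noise term penalized by the nuclear norm. This is an assumption-laden illustration, not the paper's algorithm; the function and parameter names (rmbdll_sketch, lam, gam, n_atoms) are hypothetical, and the intra-class/inter-class local constraint term is omitted.

```python
# Illustrative sketch only: a simplified multi-band joint dictionary learning
# scheme in the spirit of RMBDLL, NOT the paper's actual objective or updates.
# Assumed (hypothetical) formulation:
#   min_{D_k, Z, E_k}  sum_k ||X_k - D_k Z - E_k||_F^2 + lam*||Z||_1 + gam*||E_k||_*
# where all frequency bands share one coding matrix Z, and each E_k absorbs
# structured noise via a nuclear-norm (low-rank) penalty.
import numpy as np

def soft_threshold(A, tau):
    """Element-wise shrinkage: proximal operator of the l1 norm."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svd_shrink(A, tau):
    """Singular-value shrinkage: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rmbdll_sketch(bands, n_atoms=32, lam=0.1, gam=1.0, n_iter=50, seed=0):
    """bands: list of (n_features_k, n_samples) arrays, one per EEG frequency band."""
    rng = np.random.default_rng(seed)
    n_samples = bands[0].shape[1]
    D = [rng.standard_normal((X.shape[0], n_atoms)) for X in bands]
    D = [Dk / np.linalg.norm(Dk, axis=0, keepdims=True) for Dk in D]
    Z = np.zeros((n_atoms, n_samples))          # shared coding coefficients
    E = [np.zeros_like(X) for X in bands]       # per-band noise components

    for _ in range(n_iter):
        # Shared-code update: one proximal-gradient (ISTA) step on the summed residual.
        grad = sum(Dk.T @ (Dk @ Z + Ek - Xk) for Dk, Ek, Xk in zip(D, E, bands))
        step = 1.0 / max(sum(np.linalg.norm(Dk, 2) ** 2 for Dk in D), 1e-8)
        Z = soft_threshold(Z - step * grad, step * lam)

        for k, Xk in enumerate(bands):
            # Noise update: singular-value shrinkage of the reconstruction residual.
            E[k] = svd_shrink(Xk - D[k] @ Z, gam)
            # Dictionary update: regularized least squares, then column renormalization.
            R = Xk - E[k]
            D[k] = R @ Z.T @ np.linalg.pinv(Z @ Z.T + 1e-6 * np.eye(n_atoms))
            D[k] /= np.maximum(np.linalg.norm(D[k], axis=0, keepdims=True), 1e-8)
    return D, Z, E
```

In this sketch the shared coding matrix Z plays the role of the common encoding coefficients across bands, while the low-rank penalty on each E_k stands in for the noise-separation step; the choice of alternating proximal updates is a generic solver pattern, not one stated in the abstract.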
