Abstract

As the pace of life accelerates, people with weaker psychological resilience are prone to negative emotions such as mania and anxiety in certain environments; these emotions interfere with normal study and daily life and can even harm physical and mental health. Accurately analyzing people's emotional states is therefore increasingly important. In this paper, based on bimodal EEG and EMG signals, we combine a broad learning system (BLS) with Dempster-Shafer (D-S) evidence theory to recognize emotions from bioelectrical signals evoked by visual stimuli. First, a suitable experimental paradigm is designed to acquire EEG and EMG signals through emotion evocation. The signals are then preprocessed and decomposed with a wavelet decomposition algorithm; the signal components most strongly affected by emotional state are extracted, and the corresponding differential entropy and power spectrum features are computed. Next, time-domain and frequency-domain EMG features are extracted, and different feature combinations are fused and analyzed. A BLS is used for classification, and finally the BLS classification results are fused at the decision level using D-S evidence theory. The method recognizes emotions effectively, achieving an accuracy of 85.2% in a three-class scenario. The proposed emotion recognition model is expected to provide a new technical tool for emotional cognitive science research and affective brain-computer interface systems.
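The decision-level fusion step named in the abstract relies on Dempster's rule of combination. As a minimal sketch (not the paper's implementation), the rule over mutually exclusive singleton emotion classes reduces to multiplying the per-class masses from the two modalities and renormalizing by the conflict; the class names and mass values below are hypothetical:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions defined over the same set of
    mutually exclusive singleton hypotheses via Dempster's rule."""
    # Agreement: for singletons, only identical hypotheses intersect.
    agreement = {h: m1[h] * m2[h] for h in m1}
    k = 1.0 - sum(agreement.values())  # conflict mass
    if k >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Normalize the agreeing mass by (1 - conflict).
    return {h: v / (1.0 - k) for h, v in agreement.items()}

# Hypothetical per-class masses from an EEG-based and an EMG-based classifier
eeg = {"positive": 0.7, "neutral": 0.2, "negative": 0.1}
emg = {"positive": 0.5, "neutral": 0.3, "negative": 0.2}
fused = dempster_combine(eeg, emg)
```

When both modalities favor the same class, the fused mass for that class exceeds either individual mass (here roughly 0.81 for "positive"), which is what makes decision-level D-S fusion attractive for bimodal emotion recognition.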
