Abstract

Emotion recognition can be performed through speech recognition, analysis of body movements, analysis of the Electrooculogram (EOG), or the capture of facial expressions. However, these methods often fail to detect human emotion reliably, because people can use fake body movements and words to hide their real emotions. In this paper, we propose an EEG-based emotion classification method built on a Bidirectional Long Short-Term Memory network (BiLSTM). The Electroencephalogram (EEG) signal reflects human emotion more faithfully, because real emotions arise in the brain and cannot be concealed there. Moreover, EEG is a time-sequence signal and therefore requires a model suited to this type of data, so we chose the Long Short-Term Memory (LSTM) network to process the EEG signal. In particular, we used an improved variant of the LSTM, the BiLSTM, which processes the input sequence both from front to back and from back to front. The BiLSTM also stores important information and forgets unnecessary information, which increases the accuracy of the model. Our method classifies four discrete emotional states (happy, sad, fear, and neutral) and achieves competitive performance compared with other conventional emotion classification methods. The final experimental results show that our method achieves an accuracy of 84.21% for the four-class emotion classification task.
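
To make the approach described above concrete, the following is a minimal sketch of a BiLSTM emotion classifier in PyTorch. The channel count, segment length, hidden size, and the use of the final forward and backward hidden states as features are illustrative assumptions and are not specified in the abstract; only the four target classes (happy, sad, fear, and neutral) come from the paper.

```python
import torch
import torch.nn as nn

class BiLSTMEmotionClassifier(nn.Module):
    """Sketch of a BiLSTM classifier for EEG emotion recognition.

    Hyperparameters (channel count, hidden size, feature choice) are
    assumptions for illustration, not the paper's reported settings.
    """

    def __init__(self, n_channels=62, hidden_size=64, n_classes=4):
        super().__init__()
        # Bidirectional LSTM reads the EEG sequence forward and backward.
        self.bilstm = nn.LSTM(
            input_size=n_channels,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        # Concatenated forward/backward features -> 4 emotion classes
        # (happy, sad, fear, neutral).
        self.classifier = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_channels) EEG segment.
        _, (h_n, _) = self.bilstm(x)
        # h_n: (2, batch, hidden_size) for a single bidirectional layer;
        # concatenate the final forward and backward hidden states.
        feats = torch.cat([h_n[0], h_n[1]], dim=1)
        return self.classifier(feats)


# Example: a batch of 8 EEG segments, 200 time steps, 62 channels (assumed).
model = BiLSTMEmotionClassifier()
logits = model(torch.randn(8, 200, 62))
print(logits.shape)  # torch.Size([8, 4])
```

Concatenating the final hidden states of the two directions gives the classifier a summary of the sequence as read both forward and backward, matching the front-to-back and back-to-front processing the abstract attributes to the BiLSTM.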
