Abstract. Emotion recognition is a branch of artificial intelligence that analyzes human emotional states through facial expressions, voice, or physiological signals. It enhances human-computer interaction by enabling more personalized and empathetic technology, which is valuable in mental health, customer service, and human-robot interaction. In recent years, research on emotion recognition from such signals has grown rapidly across multiple disciplines. Electroencephalogram (EEG)-based brain-computer interfaces (BCIs) allow a user's emotional state to be sensed and analyzed directly and non-intrusively, improving user experience and system responsiveness. This capability supports adaptive artificial intelligence (AI) in areas such as healthcare, for personalized treatment, and entertainment, for immersive experiences, advancing human-technology symbiosis. This paper compares five current machine learning (ML)-based emotion recognition methods that leverage EEG signals, evaluating their effectiveness and applicability. It concludes that while the Convolutional Neural Network (CNN) and the Long Short-Term Memory (LSTM) network each have strengths, their combination yields the best performance in EEG-based emotion recognition.
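To illustrate the kind of hybrid architecture the abstract refers to, the sketch below shows a minimal CNN-LSTM classifier for windowed EEG signals. The layer sizes, input shape (batch, channels, time), and class count are illustrative assumptions and are not taken from the models evaluated in the paper.

```python
# Hypothetical sketch of a hybrid CNN-LSTM classifier for EEG emotion
# recognition; sizes and shapes are illustrative assumptions, not the
# architecture evaluated in the paper.
import torch
import torch.nn as nn

class CNNLSTMEmotionNet(nn.Module):
    def __init__(self, n_channels=32, n_classes=4, hidden=64):
        super().__init__()
        # CNN front-end extracts local spatio-temporal features
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # LSTM models longer-range temporal dependencies in the feature sequence
        self.lstm = nn.LSTM(input_size=128, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        feats = self.cnn(x)            # (batch, 128, time')
        feats = feats.transpose(1, 2)  # (batch, time', 128)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])      # class logits

# Example: a batch of 8 one-second EEG windows sampled at 128 Hz
model = CNNLSTMEmotionNet()
logits = model(torch.randn(8, 32, 128))
print(logits.shape)  # torch.Size([8, 4])
```

The design intent mirrors the abstract's conclusion: the convolutional layers capture local spatial-temporal patterns across EEG channels, while the LSTM aggregates them over time before classification.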