Abstract

As interface paradigms change, users may no longer be satisfied with purely behavior-based input devices such as the mouse and keyboard. In this paper, we propose a real-time user interface with emotion recognition to support the shift toward a more human-centered interface paradigm. The proposed emotion recognition technology can provide services that respond to a user's emotions while consuming content. To date, most studies on emotion recognition interfaces have relied on a single signal, which has been difficult to apply in practice because of its low accuracy. In this study, we developed a complex biological-signal emotion recognition system that fuses an ECG-derived ratio reflecting autonomic nervous system activity with the relative power values of EEG bands (theta, alpha, beta, and gamma) to overcome this low accuracy. The system builds a data map that stores user-specific probabilities for recognizing six emotions (amusement, fear, sadness, joy, anger, and disgust), and it updates the weights so that the brain-wave channels most indicative of each emotion contribute more strongly to the result. In addition, we compared results on the complex biological-signal data set against the EEG-only data set to verify the accuracy of the complex biological signal, and found that accuracy increased by 35.78%. The proposed system can be used as a high-accuracy interface for controlling games and smart spaces.
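The fusion strategy summarized above can be sketched in code. Note that the feature layout (four relative EEG band powers plus one ECG LF/HF-style ratio), the softmax probability map, and the weight-update rule below are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

EMOTIONS = ["amusement", "fear", "sadness", "joy", "anger", "disgust"]
BANDS = ["theta", "alpha", "beta", "gamma"]

class EmotionFusion:
    """Per-user data map: one weight vector per emotion over the fused
    feature vector (4 EEG relative band powers + 1 ECG ratio).
    Hypothetical sketch of the complex biological-signal approach."""

    def __init__(self, lr=0.1):
        self.lr = lr
        # Start from uniform weights; shape (6 emotions, 5 features).
        self.weights = np.ones((len(EMOTIONS), len(BANDS) + 1))

    @staticmethod
    def features(band_power, ecg_ratio):
        """band_power: dict of absolute power per EEG band;
        ecg_ratio: an ECG-derived ratio reflecting autonomic balance."""
        total = sum(band_power[b] for b in BANDS)
        rel = [band_power[b] / total for b in BANDS]  # relative power
        return np.array(rel + [ecg_ratio])

    def probabilities(self, feat):
        """User-specific probability map over the six emotions."""
        scores = self.weights @ feat
        p = np.exp(scores - scores.max())  # numerically stable softmax
        return p / p.sum()

    def update(self, feat, true_emotion):
        """Shift weights toward the labeled emotion, damping the rest."""
        i = EMOTIONS.index(true_emotion)
        p = self.probabilities(feat)
        for j in range(len(EMOTIONS)):
            target = 1.0 if j == i else 0.0
            self.weights[j] += self.lr * (target - p[j]) * feat
        return self.probabilities(feat)
```

With uniform initial weights the map starts at 1/6 per emotion; each labeled observation then raises the probability of the matching emotion for that user's signal pattern.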

