Abstract

Affective computing is central to achieving advanced human-computer interaction and is a popular research direction in artificial intelligence for smart healthcare frameworks. In recent years, the use of electroencephalograms (EEGs) to analyze human emotional states has become a hot topic in emotion recognition. However, the EEG is a non-stationary, non-linear signal that is sensitive to interference from other physiological signals and external factors, and traditional emotion recognition methods suffer from complex algorithm structures and low recognition accuracy. In this article, based on an in-depth analysis of EEG signals, we study emotion recognition methods in the following respects. First, we use the DEAP dataset and the arousal model, and filter the raw signal to suppress artifacts: the frequency band of interest is selected with a Butterworth filter, and the data are then scaled to a common range using min-max normalization. Next, we run combined experiments on sliding-window length and overlap to find an optimal configuration for feature computation, and apply the Discrete Wavelet Transform (DWT) to extract features from the preprocessed EEG data. Finally, a k-Nearest Neighbor (kNN) machine learning model is used for recognition and classification, and different combinations of DWT and kNN parameters are tested and tuned. After 10-fold cross-validation, the accuracy reaches 86.4%. Compared with the state of the art, this method achieves higher recognition accuracy than conventional methods while maintaining a simple structure and fast operation.
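The pipeline summarized above (Butterworth band-pass filtering, min-max normalization, sliding windows with overlap, DWT sub-band features, kNN with 10-fold cross-validation) can be sketched as follows. This is a minimal illustration on synthetic signals, not the authors' implementation: the band edges, window length, overlap, wavelet (`db4`), decomposition level, and `k` are assumed values, and random sinusoid-plus-noise trials stand in for the DEAP recordings.

```python
import numpy as np
import pywt
from scipy.signal import butter, filtfilt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate of DEAP's preprocessed signals (Hz)

def bandpass(sig, low=4.0, high=45.0, fs=FS, order=4):
    """Butterworth band-pass filter; the 4-45 Hz band is an assumption."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, sig)

def minmax(sig):
    """Min-max normalization to a common [0, 1] range."""
    return (sig - sig.min()) / (sig.max() - sig.min() + 1e-12)

def windows(sig, win=2 * FS, overlap=0.5):
    """Sliding windows with fractional overlap (2 s / 50% are assumptions)."""
    step = int(win * (1 - overlap))
    return [sig[i:i + win] for i in range(0, len(sig) - win + 1, step)]

def dwt_features(seg, wavelet="db4", level=4):
    """Energy of each DWT sub-band as one feature vector."""
    coeffs = pywt.wavedec(seg, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Synthetic stand-in for labelled EEG trials (two arousal classes).
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(20):
        t = np.arange(8 * FS) / FS
        freq = 10 if label == 0 else 25  # alpha-like vs. beta-like content
        sig = np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(t.size)
        sig = minmax(bandpass(sig))
        for seg in windows(sig):
            X.append(dwt_features(seg))
            y.append(label)
X, y = np.array(X), np.array(y)

# kNN classification evaluated with 10-fold cross-validation.
knn = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(knn, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.3f}")
```

In this sketch the 10 Hz and 25 Hz components fall into different DWT sub-bands (roughly 8-16 Hz and 16-32 Hz at level 4 with FS = 128), so their sub-band energies separate the two classes; grid-searching window length, overlap, wavelet, level, and `k` would correspond to the parameter combinations the abstract describes.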
