Abstract

Many current applications depend on human-computer interaction systems for service delivery. These applications need to know personal attributes of users, such as gender, to improve the user experience. Gender can be inferred from speech and facial characteristics; however, gender prediction based on physiological measurements such as electroencephalogram (EEG) signals offers a more robust and reliable basis for an automatic gender classification system. EEG signals record brain activity from the outer scalp and are therefore contaminated by external and internal noise, which makes designing an effective EEG-based gender detection mechanism challenging: gender-related EEG features must be captured accurately. In this paper, we investigated the effect of emotions on a gender prediction system using EEG data recorded in negative and positive emotional states. We proposed a model that extracts power spectral density features from EEG signals in these emotional states and predicts gender using three classifiers: decision tree, random forest, and multilayer perceptron. Furthermore, we studied the effect of omitting a single electrode and multiple electrodes from the EEG data on the proposed system. The experimental results demonstrate the effectiveness of emotional-state EEG data for gender identification: the random forest classifier achieved the lowest error rate, 7%, on negative-emotion EEG signals. Finally, the results revealed that the brain's frontal lobe is particularly effective at differentiating between males and females.
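The pipeline described above (power spectral density features followed by a random forest classifier) can be sketched as follows. This is a minimal illustration on synthetic EEG-like data, not the paper's actual implementation: the sampling rate, channel count, frequency bands, and classifier hyperparameters are all assumptions made here for demonstration.

```python
# Hedged sketch: PSD-feature extraction + random forest classification,
# analogous to the approach described in the abstract. All data and
# parameters below are synthetic/assumed, not from the paper.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 128                                  # sampling rate in Hz (assumed)
n_trials, n_channels, n_samples = 200, 4, 512

# Synthetic EEG: one class receives extra alpha-band (10 Hz) power,
# standing in for a hypothetical gender-related spectral difference.
t = np.arange(n_samples) / fs
labels = rng.integers(0, 2, n_trials)
eeg = rng.normal(size=(n_trials, n_channels, n_samples))
eeg += 0.8 * labels[:, None, None] * np.sin(2 * np.pi * 10 * t)

def psd_features(trial):
    """Mean band power per channel in theta, alpha, and beta bands."""
    freqs, psd = welch(trial, fs=fs, nperseg=256, axis=-1)
    bands = [(4, 8), (8, 13), (13, 30)]   # theta, alpha, beta (Hz)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in bands]
    return np.concatenate(feats)          # n_channels * n_bands features

X = np.array([psd_features(trial) for trial in eeg])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

The electrode-omission study mentioned in the abstract could be simulated in this sketch by dropping a channel's features (columns of `X`) before training and comparing the resulting accuracy.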
