Abstract

Biometric signals have been extensively used for user identification and authentication because their inherent characteristics are unique to each person. The variation exhibited between the brain signals (EEG) of different people makes such signals especially suitable for biometric user identification. However, the characteristics of these signals are also influenced by the user's current condition, including their affective state. In this paper, we analyze the significance of the affect-related component of brain signals within the subject identification context. Consistent results are obtained across three different public datasets, suggesting that the dominant component of the signal is subject-related, but the affective state also makes a contribution that affects identification accuracy. Results show that identification accuracy increases when the system has been trained on EEG recordings captured in an affective state similar to that of the sample being identified. This improvement holds independently of the features and classification algorithm used, and it is generally above 10 percent under a rigorous setting, in which the training and validation datasets do not share data from the same recording days. This finding emphasizes the potential benefits of considering affective information in applications that require subject identification, such as user authentication.
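The effect described in the abstract can be illustrated with a toy simulation. Everything below is an illustrative assumption, not the paper's method: the feature dimensions, the additive "subject signature plus affect offset" model, the noise levels, and the nearest-centroid classifier are all made up to sketch why training and testing in matching affective states could help.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_feat = 5, 8

# Hypothetical model: each subject has a stable feature signature
# (the dominant, subject-related component), shifted by a smaller
# affect-dependent offset shared across subjects.
subject_sig = rng.normal(0.0, 1.0, (n_subjects, n_feat))
affect_offset = {
    "calm": rng.normal(0.0, 0.8, n_feat),
    "excited": rng.normal(0.0, 0.8, n_feat),
}

def record(subj, affect, n=20, noise=0.5):
    """Simulate n EEG feature vectors for one subject in one affective state."""
    base = subject_sig[subj] + affect_offset[affect]
    return base + rng.normal(0.0, noise, (n, n_feat))

def centroids(train_affect):
    """'Train' a nearest-centroid identifier on recordings in one state."""
    return np.stack([record(s, train_affect).mean(axis=0)
                     for s in range(n_subjects)])

def accuracy(train_affect, test_affect):
    """Identify test recordings against centroids; return accuracy."""
    C = centroids(train_affect)
    correct, total = 0, 0
    for s in range(n_subjects):
        X = record(s, test_affect)
        dists = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
        correct += int((dists.argmin(axis=1) == s).sum())
        total += len(X)
    return correct / total

matched = accuracy("calm", "calm")       # train and test states agree
mismatched = accuracy("excited", "calm")  # train and test states differ
print(f"matched-affect accuracy:    {matched:.2f}")
print(f"mismatched-affect accuracy: {mismatched:.2f}")
```

Under this toy model, the affect offset displaces all of a subject's test samples away from centroids learned in a different state, which is one mechanism by which an affect mismatch could reduce identification accuracy; the exact gap depends on how large the affect component is relative to between-subject separation.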
