Abstract

The P300-based concealed information test is a well-known approach to the detection of deception. Previous studies used only the parietal channel, Pz, or at most the three midline frontal, central, and parietal electrodes (Fz, Cz, and Pz); however, various observations indicate that the P300 component is distributed over a wide area of the scalp. In this paper, we propose a discriminative sparse representation model that effectively exploits multichannel concealed information test data. More specifically, a discriminative spatial filter is incorporated into the sparse model. Based on the proposed objective function, the elements of the model (dictionary, spatial filter, and sparse code) are updated iteratively so that the discrimination power of the multichannel sparse model improves. Furthermore, the effect of a sparsity constraint on the spatial filter coefficients is investigated. To evaluate the performance of the multichannel model in distinguishing guilty subjects from innocent ones, 44 subjects went through a mock-crime scenario, and their electroencephalography (EEG) signals were subsequently recorded from eight electrodes. Experimental results show that the multichannel discriminative dictionary and spatial filter learning method outperforms other multichannel methods, classifying guilty and innocent participants with 95% accuracy. Moreover, the learned spatial filters are consistent with physiological evidence on the distribution of P300 subcomponents over the scalp.
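To make the structure of such a model concrete, one plausible form of the objective is sketched below. This is an illustration under assumptions, not the paper's exact formulation: the multichannel data matrix Y, spatial filter w, dictionary D (with atoms d_k), sparse codes X, discriminative term f, and weights λ, γ are all hypothetical notation introduced here.

\[
% Hypothetical notation -- a sketch, not the paper's exact objective.
\min_{D,\,w,\,X}\;
\big\lVert w^{\top} Y - D X \big\rVert_F^2
\;+\; \lambda \lVert X \rVert_1
\;+\; \gamma\, f(X,\text{labels})
\quad \text{s.t. } \lVert d_k \rVert_2 \le 1 \ \forall k,
\]

where the first term measures how well the spatially filtered data are reconstructed from the dictionary, the ℓ1 penalty enforces sparse codes, and f rewards codes that separate the classes. An alternating scheme of the kind the abstract describes would cycle through three subproblems: sparse coding of X with D and w fixed, a dictionary update with X and w fixed, and a spatial filter update with D and X fixed. The sparsity constraint on the filter coefficients mentioned above would correspond to adding an ℓ1 penalty \(\lVert w \rVert_1\) to the objective.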
