Abstract

Dim target detection in remote sensing images is a significant and challenging problem. In this work, we explore event-related brain responses during dim target detection tasks and extend brain-computer interface (BCI) systems to this task to enhance detection efficiency. We develop a BCI paradigm named the Asynchronous Visual Evoked Paradigm (AVEP), in which subjects search for dim targets within satellite images while their scalp electroencephalography (EEG) signals are simultaneously recorded. In this paradigm, stimulus onset and target onset are asynchronous because subjects need sufficient time to confirm whether targets of interest are present in the serially presented images. We further propose a Domain adaptive and Channel-wise attention-based Time-domain Convolutional Neural Network (DC-tCNN) to solve the single-trial EEG classification problem for the AVEP task. In this model, a multi-scale CNN module is combined with a channel-wise attention module to effectively extract the event-related brain responses underlying the EEG signals, and domain adaptation is introduced to mitigate the cross-subject distribution discrepancy. The results demonstrate the superior performance and better generalizability of this model in classifying single-trial EEG data from the AVEP task compared with typical EEG deep learning networks. Visualization analyses of spatiotemporal features further illustrate the effectiveness and interpretability of the proposed paradigm and learning model. Together, the proposed paradigm and model can effectively capture ambiguous event-related brain responses in EEG-based dim target detection tasks, and our work provides a valuable reference for BCI-based image detection of dim targets.
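The abstract does not give implementation details, so the following is only a rough, hypothetical sketch of the kind of architecture it describes: parallel temporal convolutions at several kernel sizes (a multi-scale CNN), a squeeze-and-excitation style channel-wise attention module, and a gradient-reversal domain classifier for cross-subject adaptation. All module names, kernel sizes, layer widths, and the choice of PyTorch are illustrative assumptions, not the authors' DC-tCNN.

```python
# Hypothetical sketch (NOT the authors' DC-tCNN): multi-scale temporal CNN with
# channel-wise attention and a gradient-reversal domain head, in PyTorch.
import torch
import torch.nn as nn
from torch.autograd import Function


class GradReverse(Function):
    """Gradient reversal layer commonly used for adversarial domain adaptation."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention over EEG channels."""
    def __init__(self, n_channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(n_channels, n_channels // reduction),
            nn.ReLU(),
            nn.Linear(n_channels // reduction, n_channels),
            nn.Sigmoid(),
        )

    def forward(self, x):            # x: (batch, n_channels, n_samples)
        w = self.fc(x.mean(dim=-1))  # per-channel weights from the temporal average
        return x * w.unsqueeze(-1)


class MultiScaleEEGNet(nn.Module):
    """Multi-scale temporal convolutions + channel attention + two output heads."""
    def __init__(self, n_channels=64, n_samples=256, n_classes=2, n_domains=10):
        super().__init__()
        self.attn = ChannelAttention(n_channels)
        # Parallel temporal convolutions with different kernel sizes (multi-scale).
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(n_channels, 16, k, padding=k // 2),
                nn.BatchNorm1d(16),
                nn.ELU(),
            )
            for k in (15, 31, 63)
        ])
        self.pool = nn.AdaptiveAvgPool1d(8)
        feat_dim = 3 * 16 * 8
        self.cls_head = nn.Linear(feat_dim, n_classes)   # target vs. non-target
        self.dom_head = nn.Linear(feat_dim, n_domains)   # subject/domain classifier

    def forward(self, x, lambd=1.0):  # x: (batch, n_channels, n_samples)
        x = self.attn(x)
        feats = torch.cat([self.pool(b(x)) for b in self.branches], dim=1).flatten(1)
        return self.cls_head(feats), self.dom_head(GradReverse.apply(feats, lambd))


# Example: 8 trials of 64-channel EEG, 256 samples each.
logits, dom_logits = MultiScaleEEGNet()(torch.randn(8, 64, 256))
```

In such a setup, the classification head is trained on labeled trials while the gradient-reversed domain head pushes the shared features toward subject-invariance, which is one common way to mitigate cross-subject distribution shift.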
