Abstract

Transfer learning, which utilizes labeled source domains to facilitate learning of a target model, is effective in alleviating the high intra- and inter-subject variations in electroencephalogram (EEG) based brain-computer interfaces (BCIs). Existing transfer learning approaches usually use the source subjects' EEG data directly, leading to privacy concerns. This paper considers a decentralized privacy-preserving transfer learning scenario: there are multiple source subjects, whose data and computations are kept local, and only the parameters or predictions of their pre-trained models can be accessed to protect privacy; the question is then how to perform effective cross-subject transfer for a new subject with unlabeled EEG trials. We propose an offline unsupervised multi-source decentralized transfer (MSDT) approach, which first generates a pre-trained model from each source subject, and then performs decentralized transfer using the source model parameters (in gray-box settings) or predictions (in black-box settings). Experiments on two datasets from two BCI paradigms, motor imagery and affective BCI, demonstrated that MSDT outperformed several existing approaches that do not consider privacy protection at all. In other words, MSDT achieved both high privacy protection and better classification performance.
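The sketch below illustrates the black-box setting described above, where only each source subject's model predictions cross the privacy boundary. The soft-vote combination of source probabilities and the confidence-weighted pseudo-labels are illustrative assumptions for this sketch, not the exact MSDT algorithm.

```python
# Minimal sketch of black-box decentralized transfer: each source subject's
# data and model stay local; only class-probability predictions for the new
# (target) subject's unlabeled EEG trials are shared.
import numpy as np

def black_box_transfer(source_predict_fns, target_trials):
    """Combine per-source predictions for unlabeled target EEG trials.

    source_predict_fns: list of callables, one per source subject; each maps
        target trials of shape (n_trials, n_channels, n_samples) to class
        probabilities of shape (n_trials, n_classes). These run on the source
        side, so raw EEG data and model parameters never leave the source.
    target_trials: unlabeled EEG trials from the new subject.
    """
    # Query every source model; only predictions cross the privacy boundary.
    all_probs = np.stack([f(target_trials) for f in source_predict_fns])

    # Soft-vote: average the source probabilities for each target trial.
    mean_probs = all_probs.mean(axis=0)

    # Pseudo-labels and their confidences for the unlabeled target trials,
    # which could then be used to train or adapt a local target model.
    pseudo_labels = mean_probs.argmax(axis=1)
    confidence = mean_probs.max(axis=1)
    return pseudo_labels, confidence
```

In the gray-box setting, the source model parameters themselves (rather than predictions) would be shared and adapted on the target side; the privacy boundary is the same in that raw source EEG data are never transmitted.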
