Abstract

This paper focuses on two main challenges in brain-computer interface transfer learning: the data feature length problem and the source-domain sample selection problem caused by individual differences. To overcome the negative transfer that results from feature length, we propose a transfer algorithm based on mutual information transfer (MIT), which selects effective features by computing the entropy of the probability distribution and the conditional distribution, thereby reducing negative transfer and improving learning efficiency. Source-domain participants whose distributions differ too much from the target domain can degrade overall classification performance. Building on MIT, we therefore propose the Pearson correlation coefficient source-domain automatic selection algorithm (PDAS). PDAS automatically selects appropriate source-domain participants according to the target-domain distribution, which reduces negative transfer among source-domain participant data, improves accuracy, and greatly reduces training time. The two proposed algorithms were tested offline and online on two public datasets, and the results were compared with those of existing state-of-the-art algorithms. The experimental results show that the MIT algorithm and the combined MIT + PDAS algorithm offer clear advantages.
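To make the two ideas concrete, the following is a minimal sketch, assuming the MIT step can be approximated by ranking features via their estimated mutual information with the class labels, and that PDAS-style subject selection can be approximated by the Pearson correlation between each source participant's mean feature vector and the target participant's. The function names, the correlation threshold, and the mean-vector summary are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: approximates MI-based feature selection and
# Pearson-correlation-based source-subject selection as described in the abstract.
import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_classif


def select_features_by_mi(X, y, k):
    """Keep the k features with the highest estimated mutual information with y."""
    mi = mutual_info_classif(X, y)       # one MI estimate per feature column
    return np.argsort(mi)[::-1][:k]      # indices of the k most informative features


def select_source_subjects(source_sets, X_target, threshold=0.5):
    """Keep source subjects whose mean feature vector correlates with the target's.

    source_sets: dict mapping subject id -> (X_subject, y_subject).
    threshold: hypothetical cutoff; the paper's selection rule may differ.
    """
    target_profile = X_target.mean(axis=0)
    selected = []
    for subject_id, (X_s, _) in source_sets.items():
        r, _ = pearsonr(X_s.mean(axis=0), target_profile)
        if r >= threshold:               # drop subjects too far from the target distribution
            selected.append(subject_id)
    return selected
```

In this reading, feature selection limits negative transfer from uninformative feature dimensions, while subject selection limits negative transfer from source participants whose data distribution is dissimilar to the target's, and also shrinks the training set.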
