Abstract

Background
Electrophysiological recordings contain mixtures of signals from distinct neural sources, impeding straightforward interpretation of the sensor-level data. This mixing is particularly detrimental when distinct sources resonate at overlapping frequencies. Fortunately, the mixing is linear and instantaneous. Multivariate source separation methods may therefore successfully separate statistical sources, even those with overlapping spatial distributions.

New Method
We demonstrate a feature-guided multivariate source separation method that is tuned to narrowband frequency content as well as binary condition differences. This method, comparison scanning generalized eigendecomposition (csGED), harnesses the covariance structure of multichannel data to find directions (i.e., eigenvectors) that maximally separate two subsets of data. To drive condition specificity and frequency specificity, the data subsets were taken from different task conditions and narrowband-filtered prior to applying GED.

Results
To validate the method, we simulated MEG data in two conditions with shared noise characteristics and unique signals. csGED outperformed the best individual sensor at reconstructing the ground-truth signals, even in the presence of large amounts of noise. We next applied csGED to a published empirical MEG dataset on visual perception vs. imagery. csGED identified sources in the alpha, beta, and gamma bands, and successfully separated distinct networks within the same frequency band.

Comparison with Existing Method(s)
GED is a flexible feature-guided decomposition method that has been applied successfully in previous work. Our combined frequency and condition tuning is a novel adaptation that extends the power of GED in cognitive electrophysiology.

Conclusions
We demonstrate successful condition-specific source separation by applying csGED to simulated and empirical data.
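To make the covariance-based separation concrete, below is a minimal Python sketch of the generalized eigendecomposition step described above. It is not the authors' implementation: the array names, shapes, shrinkage regularization, and use of NumPy/SciPy are illustrative assumptions, and the data for the two conditions are assumed to have already been narrowband-filtered.

    # Minimal sketch of a condition-comparison GED, assuming narrowband-filtered
    # multichannel data `data_a` and `data_b` (channels x time) for two conditions.
    import numpy as np
    from scipy.linalg import eigh

    def ged_components(data_a, data_b, shrinkage=0.01):
        """Eigenvectors that maximize condition-A variance relative to condition B."""
        cov_a = np.cov(data_a)  # channels x channels covariance, condition A
        cov_b = np.cov(data_b)  # channels x channels covariance, condition B

        # Light shrinkage regularization of the reference covariance (assumed choice)
        n = cov_b.shape[0]
        cov_b_reg = (1 - shrinkage) * cov_b \
            + shrinkage * np.mean(np.linalg.eigvalsh(cov_b)) * np.eye(n)

        # Generalized eigendecomposition: maximize w' C_a w / w' C_b w
        evals, evecs = eigh(cov_a, cov_b_reg)

        # Sort descending so the first component is most condition-A-specific
        order = np.argsort(evals)[::-1]
        return evals[order], evecs[:, order]

    # Example usage with random stand-in data (64 channels, 10 s at 1 kHz)
    rng = np.random.default_rng(0)
    data_a = rng.standard_normal((64, 10_000))
    data_b = rng.standard_normal((64, 10_000))
    evals, evecs = ged_components(data_a, data_b)
    component_timecourse = evecs[:, 0] @ data_a  # project data onto top component

In this sketch, scanning across narrowband filter center frequencies and repeating the decomposition would yield the frequency-resolved comparison implied by the method's name; that loop is omitted here for brevity.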
