The fusion of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data exploits the complementary properties of the two modalities; a joint analysis of both can therefore be used in brain studies to estimate the shared and unshared components of brain activity. This study introduces a comprehensive approach for jointly analyzing EEG and fMRI data using the advanced coupled matrix tensor factorization (ACMTF) method. A similarity measure for the components, based on normalized mutual information (NMI), is defined to overcome the restrictive equality assumption on shared components in the common dimension of the ACMTF method. Because the mutual information (MI) measure can identify both linear and nonlinear relationships between components, the proposed method can be viewed as a generalization of ACMTF and is therefore called generalized coupled matrix tensor factorization (GCMTF). The proposed GCMTF method was applied to simulated data in which the components exhibited a nonlinear relationship. The results demonstrate that the average match score increased by 23.46% compared with the ACMTF model, even at different noise levels. Furthermore, applying the method to real data from an auditory oddball paradigm identified three shared components with frequency responses in the alpha and theta bands. The proposed MI-based method not only extracts shared components with any linear or nonlinear relationship but also identifies more active brain areas corresponding to the auditory oddball paradigm than ACMTF and other similar methods.
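To illustrate why an NMI-based similarity can detect nonlinear dependence between components where linear correlation fails, the following is a minimal sketch using a standard histogram-based NMI estimator. The binning scheme, function name, and test signals are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def normalized_mutual_information(x, y, bins=16):
    """Histogram-based NMI estimate between two 1-D component signals.

    NMI = 2 * I(X; Y) / (H(X) + H(Y)), bounded in [0, 1].
    (Illustrative estimator; the paper's exact definition may differ.)
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()          # joint probability estimate
    px = pxy.sum(axis=1)               # marginal of x
    py = pxy.sum(axis=0)               # marginal of y
    nz = pxy > 0                       # avoid log(0)
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return 2.0 * mi / (hx + hy)

# A purely nonlinear relationship (y = x**2, x zero-mean Gaussian):
# linear correlation is near zero, yet NMI reveals strong dependence.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = x ** 2
print("NMI:        ", normalized_mutual_information(x, y))
print("correlation:", abs(np.corrcoef(x, y)[0, 1]))
```

This is the motivating case for GCMTF: two components can be strongly related without any linear correlation, so a match criterion built on MI generalizes the equality assumption used by ACMTF.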