Abstract

Multi-modal imbalanced data learning problems are becoming increasingly common in the real world, especially in brain disease diagnosis. Although multi-modal data provide complementary information for decision making, they can also make the model more sensitive to the adverse effects of class imbalance. Most existing imbalance learning methods are based on a single modality. In this paper, we design a cognitive-driven ordinal preservation model for multi-modal imbalanced data that is optimized at both the feature and sample levels. At the feature level, we project the original data into the label space through a second-order Laplacian manifold to better capture minor changes and preserve the discriminative information among samples. At the sample level, we derive class-specific self-paced learning, which simulates the human cognitive mechanism, to drive the data participating in learning from a balanced subset to the whole set, thereby reducing the negative effects of imbalance on the learning model. Meanwhile, we impose a group sparsity constraint on the projection matrix to embed the latent relationship patterns among different modalities, and we theoretically prove the convergence of the resulting optimization. The proposed method is applied to multi-modal brain disease diagnosis, including Alzheimer's disease (AD) and epilepsy. The experimental results show that our method outperforms existing imbalance and fusion algorithms.
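The class-specific self-paced idea described above can be illustrated with a minimal sketch. The function name `class_specific_spl_weights`, the hard 0/1 sample weights, and the linear pace schedule below are illustrative assumptions rather than the authors' exact formulation; the sketch only shows how each class can start from an equally sized (balanced) subset of its lowest-loss samples and gradually admit all of its samples as training proceeds.

```python
import numpy as np

def class_specific_spl_weights(losses, labels, pace):
    """Hard 0/1 self-paced weights chosen independently per class.

    At pace = 0 every class admits the same number of samples (the size of
    the smallest class), giving a balanced starting subset; as pace grows to
    1 each class admits all of its samples, so the whole set participates.
    Within a class, the lowest-loss ("easiest") samples are admitted first.
    """
    losses = np.asarray(losses, dtype=float)
    labels = np.asarray(labels)
    classes, counts = np.unique(labels, return_counts=True)
    n_min = counts.min()
    weights = np.zeros_like(losses)
    for c, n_c in zip(classes, counts):
        idx = np.where(labels == c)[0]
        k = int(np.ceil(n_min + pace * (n_c - n_min)))  # balanced -> full class
        easiest = idx[np.argsort(losses[idx])[:k]]
        weights[easiest] = 1.0
    return weights

# Toy usage with a growing pace schedule over training rounds.
rng = np.random.default_rng(0)
losses = rng.random(20)                      # per-sample losses from a current model
labels = np.array([0] * 15 + [1] * 5)        # imbalanced toy labels
for t, pace in enumerate(np.linspace(0.0, 1.0, 4)):
    w = class_specific_spl_weights(losses, labels, pace)
    print(f"round {t}: class 0 -> {int(w[labels == 0].sum())}, "
          f"class 1 -> {int(w[labels == 1].sum())} samples included")
```

In an alternating optimization, such weights would typically be recomputed after each model update, so the learner first fits a balanced, easy subset and only later sees the full imbalanced set.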
