Abstract

Over the past few years, multimodal data analysis has emerged as an indispensable approach for identifying sample categories. In the multi-view data classification problem, the joint representation is expected to include the supervised information of sample categories, so that similarity in the latent space implies similarity in the corresponding concepts. Since each view has different statistical properties, the joint representation should be able to encapsulate the underlying nonlinear data distribution of the given observations. Another important aspect is coherent knowledge across the multiple views: the learning objective of the multi-view model should efficiently capture the nonlinear correlated structures across different modalities. In this context, this article introduces a novel architecture, termed discriminative deep canonical correlation analysis (D2CCA), for classifying given observations into multiple categories. The learning objective of the proposed architecture includes the merits of generative models to identify the underlying probability distribution of the given observations. To improve its discriminative ability, supervised information is incorporated into the learning objective, which also enables the architecture to serve as both a feature extractor and a classifier. The theory of CCA is integrated with the objective function so that the joint representation of the multi-view data is learned from maximally correlated subspaces. The proposed framework is consolidated with a corresponding convergence analysis. The efficacy of the proposed architecture is studied on several application domains, namely object recognition, document classification, multilingual categorization, face recognition, and cancer subtype identification, with reference to several state-of-the-art methods.
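To make the general idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: two view-specific encoders produce latent representations, a simplified per-dimension correlation surrogate stands in for the full CCA trace objective, and a supervised cross-entropy term on the concatenated latents supplies the discriminative signal. It assumes PyTorch; the names ViewEncoder, TwoViewClassifier, correlation_loss, the layer sizes, and the 0.5 trade-off weight are all hypothetical choices for illustration.

# Illustrative sketch only: two-view encoders trained with a simplified
# correlation surrogate plus a supervised cross-entropy term.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ViewEncoder(nn.Module):
    """Maps one view into a shared latent space (hypothetical architecture)."""

    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, x):
        return self.net(x)


def correlation_loss(z1, z2, eps=1e-8):
    """Negative mean per-dimension Pearson correlation between two latents.

    A simplified surrogate for the CCA objective: it rewards latent
    coordinates that are highly correlated across the two views.
    """
    z1 = z1 - z1.mean(dim=0, keepdim=True)
    z2 = z2 - z2.mean(dim=0, keepdim=True)
    num = (z1 * z2).mean(dim=0)
    den = torch.sqrt((z1 ** 2).mean(dim=0) * (z2 ** 2).mean(dim=0)) + eps
    return -(num / den).mean()


class TwoViewClassifier(nn.Module):
    """Joint representation = concatenated latents; a linear head classifies it."""

    def __init__(self, dim1, dim2, latent_dim, n_classes):
        super().__init__()
        self.enc1 = ViewEncoder(dim1, latent_dim)
        self.enc2 = ViewEncoder(dim2, latent_dim)
        self.head = nn.Linear(2 * latent_dim, n_classes)

    def forward(self, x1, x2):
        z1, z2 = self.enc1(x1), self.enc2(x2)
        logits = self.head(torch.cat([z1, z2], dim=1))
        return z1, z2, logits


# Toy usage with random data: batch of 64, views of 100 and 80 features, 5 classes.
model = TwoViewClassifier(100, 80, latent_dim=10, n_classes=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x1, x2 = torch.randn(64, 100), torch.randn(64, 80)
y = torch.randint(0, 5, (64,))
z1, z2, logits = model(x1, x2)
loss = F.cross_entropy(logits, y) + 0.5 * correlation_loss(z1, z2)
loss.backward()
opt.step()

The single objective combines the correlation term (coherence across views) with the classification term (supervised, discriminative information), which is what allows the same network to act as both a feature extractor and a classifier in the sense described above.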
