Abstract

Multiview learning aims to learn beneficial patterns from heterogeneous data sources and has attracted growing attention in recent years. Most previous studies have focused on searching for an effective feature embedding for downstream tasks using diverse optimization algorithms; however, very limited work has explored the connection between multiview learning and deep neural networks whose structures share hidden layers. In this article, we propose a multiview deep matrix factorization model to learn a shared compact representation from multiview data. First, the proposed model constructs a multiview auto-encoder architecture with one shared encoder and multiple decoders, where each view corresponds to a factorization and the shared encoder yields a common hidden layer. Accordingly, the matrix factorizations of the multiview data share the last hidden layer as a high-level semantic representation. Second, the nonnegativity constraint on the learned representation is transformed into a projection operation, which is easily realized through the activation of the shared encoder network. Third, the network is trained with a joint loss combining the reconstruction error and a compactness loss. By adding a clustering layer, the proposed method serves as an end-to-end multiview clustering method. Finally, comprehensive experiments on nine real-world datasets demonstrate the superiority of the proposed method over state-of-the-art multiview clustering methods.
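As a rough illustration of the architecture described above, the following PyTorch-style sketch shows one shared encoder feeding multiple per-view decoders, a ReLU projection keeping the shared representation nonnegative, and a joint reconstruction-plus-compactness loss. The layer widths, the specific activation used as the projection, and the centroid-distance compactness term are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch of a shared-encoder / multi-decoder multiview auto-encoder;
# sizes and the compactness term are assumptions, not the paper's exact model.
import torch
import torch.nn as nn


class MultiviewDeepMF(nn.Module):
    def __init__(self, view_dims, hidden_dim=64):
        super().__init__()
        # One shared encoder: concatenated views -> common nonnegative representation.
        self.encoder = nn.Sequential(
            nn.Linear(sum(view_dims), 256),
            nn.ReLU(),
            nn.Linear(256, hidden_dim),
            nn.ReLU(),  # projection step: keeps the shared hidden layer nonnegative
        )
        # One decoder per view reconstructs that view from the shared representation.
        self.decoders = nn.ModuleList(
            [nn.Linear(hidden_dim, d) for d in view_dims]
        )

    def forward(self, views):
        h = self.encoder(torch.cat(views, dim=1))   # shared last hidden layer
        recons = [dec(h) for dec in self.decoders]  # per-view reconstructions
        return h, recons


def joint_loss(views, recons, h, lam=0.1):
    # Reconstruction error summed over views plus a simple compactness penalty
    # (mean squared distance of each sample to the batch centroid); the paper's
    # exact compactness loss may differ.
    rec = sum(nn.functional.mse_loss(r, v) for r, v in zip(recons, views))
    compact = ((h - h.mean(dim=0, keepdim=True)) ** 2).sum(dim=1).mean()
    return rec + lam * compact
```

In this reading, clustering could be attached as an additional layer operating on the shared representation `h`, so that representation learning and cluster assignment are trained end to end, as the abstract states.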
