Abstract

In recent years, Zero-shot Node Classification (ZNC), an emerging and more challenging task in which the classes of test nodes are unobserved during training, has begun to attract attention. Existing ZNC studies mainly use Graph Neural Networks (GNNs) to construct a feature subspace aligned with the classes' semantic subspace, thereby enabling knowledge transfer from seen classes to unseen classes. However, the node features are modeled from a single view, e.g., as a bag-of-words vector, which is insufficient to fully describe the characteristics of a node. To address this limitation, we propose a Multi-View Enhanced zero-shot node classification paradigm (MVE) that improves the model's ability to generalize by approaching a human-like, multi-perspective way of reasoning. Specifically, multi-view features are obtained from different sources, such as pre-trained model embeddings, knowledge graphs, and statistical methods, and are then fused by a contrastive learning module into a compositional node representation. Meanwhile, a Graph Convolutional Network (GCN) lets each node fully absorb information from its neighbors, while the over-smoothing issue is alleviated by the multi-view features and the proposed contrastive learning mechanism. Experimental results on three public datasets show an average 25% improvement over baseline methods, demonstrating the superiority of our multi-view learning framework. The code and data can be found at https://github.com/guaiqihen/MVE.
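
To make the pipeline described above concrete, the following is a minimal, hypothetical PyTorch sketch (not taken from the MVE repository): each feature view is projected into a shared space, the views are fused and propagated through a small GCN, and an InfoNCE-style contrastive loss ties pairs of views together. The layer sizes, the mean-fusion step, and the toy graph are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only: multi-view fusion + GCN + contrastive loss.
# Dimensions, fusion scheme, and the toy graph are placeholder assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: normalized adjacency times linearly mapped features."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        return adj_norm @ self.lin(x)


class MultiViewEncoder(nn.Module):
    """Projects each feature view into a shared space, averages the views,
    then propagates the fused representation with a two-layer GCN."""
    def __init__(self, view_dims, hidden_dim, out_dim):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(d, hidden_dim) for d in view_dims])
        self.gcn1 = GCNLayer(hidden_dim, hidden_dim)
        self.gcn2 = GCNLayer(hidden_dim, out_dim)

    def forward(self, views, adj_norm):
        projected = [p(v) for p, v in zip(self.proj, views)]
        fused = torch.stack(projected).mean(0)          # simple mean fusion
        h = F.relu(self.gcn1(fused, adj_norm))
        return self.gcn2(h, adj_norm), projected


def contrastive_loss(z1, z2, temperature=0.5):
    """InfoNCE between two views: the same node across views is the positive pair."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    n = 8
    adj = torch.eye(n)                                  # toy graph: self-loops only
    views = [torch.randn(n, 300), torch.randn(n, 128)]  # e.g. bag-of-words, pretrained embedding
    model = MultiViewEncoder([300, 128], hidden_dim=64, out_dim=32)
    node_repr, projected = model(views, adj)
    loss = contrastive_loss(projected[0], projected[1])
    loss.backward()
    print(node_repr.shape, loss.item())
```

In this sketch the contrastive loss encourages the different views of the same node to agree, which is one plausible way the multi-view signal can counteract the over-smoothing of a plain GCN; the authors' actual fusion and training objective may differ.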
