Abstract

Heterogeneous information network (HIN) embedding aims to learn low-dimensional representations of nodes while preserving the structures and semantics of HINs. Although most existing methods consider heterogeneous relations and achieve promising performance, they usually employ a single model for all relations without distinction, which inevitably restricts the capability of HIN embedding. In this paper, we argue that heterogeneous relations have different structural characteristics, and propose a novel Relation structure-aware HIN Embedding model, called RHINE. Through a thorough analysis of four real-world networks, we present two structure-related measures that consistently divide heterogeneous relations into two categories: Affiliation Relations (ARs) and Interaction Relations (IRs). To respect the distinctive structural characteristics of these relations, RHINE employs different models specifically tailored to ARs and IRs, which better capture the structures of HINs; these models are then combined and optimized in a unified manner. Furthermore, considering that nodes connected via heterogeneous relations may have multi-aspect semantics, with each relation focusing on one aspect, we introduce relation-specific projection matrices to learn node and relation embeddings in separate spaces rather than a common space, which better preserves the semantics of HINs and yields a new model, RHINE-M. Experiments on four real-world datasets demonstrate that our models significantly outperform state-of-the-art methods on four tasks.
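
To make the two ideas in the abstract concrete, the following Python sketch illustrates (1) using different score functions for ARs and IRs and (2) projecting node embeddings into a relation-specific space before scoring, as described for RHINE-M. This is a minimal illustration under assumed forms: the dimension `d`, the specific score functions `score_ar` and `score_ir`, and the helper `project` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch (not the authors' code): two relation-aware score functions
# plus a relation-specific projection matrix. All shapes, names, and score
# forms here are assumptions for illustration only.

rng = np.random.default_rng(0)
d = 8  # embedding dimension (assumed)

# toy embeddings for two nodes and one relation
u = rng.normal(size=d)          # source node embedding
v = rng.normal(size=d)          # target node embedding
r = rng.normal(size=d)          # relation embedding
M_r = rng.normal(size=(d, d))   # relation-specific projection matrix

def project(x, M):
    """Map a node embedding into the relation-specific space."""
    return M @ x

def score_ar(u, v):
    """AR-style score: plain closeness between the two nodes
    (squared Euclidean distance, an assumed form)."""
    return np.sum((u - v) ** 2)

def score_ir(u, r, v):
    """IR-style score: translation-based distance, i.e. u + r
    should land near v (an assumed form)."""
    return np.sum(np.abs(u + r - v))

# Scoring in a shared space (RHINE-style idea) versus scoring after
# projecting both nodes into the relation's own space (RHINE-M-style idea).
print("AR score (shared space):   ", score_ar(u, v))
print("IR score (shared space):   ", score_ir(u, r, v))
print("IR score (relation space): ", score_ir(project(u, M_r), r, project(v, M_r)))
```

In practice, such scores would be plugged into a margin- or likelihood-based loss and the embeddings and projection matrices trained jointly; the sketch only shows how the relation-specific spaces keep the semantics of different relations separate.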
