Abstract

Graph Neural Networks (GNNs), a family of powerful models built on homogeneous networks, have become an effective tool for learning embedding representations of graph-structured data and have been widely used in various data mining tasks. Applying a GNN to embed a Heterogeneous Information Network (HIN), however, remains a major challenge, chiefly because HINs contain many different types of nodes and many different types of relationships between nodes. A HIN carries rich semantic and structural information, which calls for a specially designed graph neural network. However, existing HIN-based graph neural network models rarely consider the interactive information hidden between the meta-paths of a HIN, which leads to poor node embeddings. In this paper, we propose an Attention-aware Heterogeneous graph Neural Network (AHNN) model to effectively extract useful information from a HIN and use it to learn the embedding representations of nodes. Specifically, we first use node-level attention to aggregate and update the embedding representation of each node, and then concatenate the node's embedding representations from different meta-paths. Finally, a semantic-level neural network is proposed to extract the feature interactions across different meta-paths and learn the final embeddings of nodes. Experimental results on three widely used datasets show that the AHNN model significantly outperforms state-of-the-art models.

Highlights

  • Graph Neural Networks (GNNs), a family of powerful models built on homogeneous networks, have become an effective tool for learning embedding representations of graph-structured data and have been widely used in various data mining tasks

  • Because the model proposed in this paper is an embedding model that utilizes a GNN to learn node representations in a Heterogeneous Information Network (HIN), we consider the classic representation model DeepWalk [18], the meta-path-based random-walk model for HINs named HERec [12], the GNN-based methods GCN [5] and Graph Attention neTwork (GAT) [9], and the graph neural learning model for HINs, the Heterogeneous graph Attention Network (HAN) [14]

  • The Attention-aware Heterogeneous graph Neural Network (AHNN) model first aggregates the neighboring information of each node under different meta-paths through node-level attention, concatenates the embeddings on different meta-paths, and then employs a semantic-level neural network to mine the feature interaction information hidden between different meta-paths and different dimensions to learn the final embeddings of nodes

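The three-stage pipeline in the highlights above can be sketched in a few lines of NumPy. This is a minimal, assumption-level illustration (random weights, a GAT-style attention score, arbitrary dimensions), not the authors' reference implementation of AHNN.

```python
# Hedged sketch of the AHNN pipeline: (1) node-level attention per meta-path,
# (2) concatenation across meta-paths, (3) a semantic-level MLP.
# All names, shapes, and weights here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def node_level_attention(h, adj, W, a):
    """GAT-style aggregation over one meta-path's neighbourhood."""
    z = h @ W                                   # project node features
    n = z.shape[0]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    logits = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            logits[i, j] = np.concatenate([z[i], z[j]]) @ a
    logits = np.where(logits > 0, logits, 0.2 * logits)  # LeakyReLU
    logits = np.where(adj > 0, logits, -1e9)             # mask non-neighbours
    alpha = softmax(logits, axis=1)
    return np.tanh(alpha @ z)                   # aggregated embeddings

n, d_in, d_out = 5, 8, 4
h = rng.normal(size=(n, d_in))

# One adjacency matrix per meta-path (self-loops included)
adjs = [np.eye(n) + (rng.random((n, n)) > 0.5) for _ in range(2)]
Ws   = [rng.normal(size=(d_in, d_out)) for _ in adjs]
attn = [rng.normal(size=(2 * d_out,)) for _ in adjs]

# 1) node-level attention per meta-path, 2) concatenate across meta-paths
per_path = [node_level_attention(h, A, W, a) for A, W, a in zip(adjs, Ws, attn)]
concat = np.concatenate(per_path, axis=1)       # shape (n, num_paths * d_out)

# 3) semantic-level neural network (here a single-layer MLP) mixes features
#    across meta-paths and dimensions to produce the final node embeddings
W1 = rng.normal(size=(concat.shape[1], d_out))
final = np.maximum(concat @ W1, 0)              # ReLU
print(final.shape)                              # (5, 4)
```

Because the MLP acts on the concatenation, each output dimension can combine features from every meta-path, which is the interaction the paper's semantic-level network is designed to capture.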

Summary

Introduction

Real data often contain structural information; for example, graph-structured data are widespread in chemistry [1,2], physics [3,4], and the social sciences [5,6]. Semantic-level attention has two problems: first, the corresponding dimensions of node embeddings on different meta-paths may represent different aspects of information; second, semantic-level attention cannot capture the interactive information between node embeddings on different meta-paths. This makes it difficult for HAN to learn high-quality embedding representations. For extracting the feature interaction information between node embeddings on different meta-paths, a semantic-level neural network is better suited than semantic attention. We therefore propose a semantic-level neural network to extract the feature interaction information hidden between node embeddings on different meta-paths, so that the comprehensive and subtle information between meta-paths can be fully exploited. The experimental results show that AHNN is superior to the state-of-the-art models
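The limitation of semantic-level attention described above can be made concrete with a toy example. A weighted sum keeps dimension k of the output dependent only on dimension k of each meta-path embedding, while a neural network over the concatenation can mix all dimensions. The numbers and weights below are arbitrary illustrations, not values from the paper.

```python
# Toy contrast between semantic attention (weighted sum) and a
# semantic-level MLP over concatenated meta-path embeddings.
import numpy as np

z1 = np.array([1.0, 0.0])   # node embedding from meta-path 1
z2 = np.array([0.0, 2.0])   # node embedding from meta-path 2

# Semantic attention: a weighted sum, so dimensions never interact
# across meta-paths (output dim k sees only dim k of z1 and z2)
beta = np.array([0.6, 0.4])
attn_out = beta[0] * z1 + beta[1] * z2          # -> [0.6, 0.8]

# Semantic-level MLP: every output unit can combine all dimensions of
# both meta-path embeddings (weights chosen arbitrarily for illustration)
concat = np.concatenate([z1, z2])               # [1, 0, 0, 2]
W = np.array([[1.0, -1.0],
              [0.5,  0.0],
              [0.0,  1.0],
              [1.0,  0.5]])
mlp_out = np.maximum(concat @ W, 0.0)           # ReLU mixes z1 and z2 features

print(attn_out)   # [0.6 0.8]
print(mlp_out)    # [3. 0.]
```

The first MLP output unit (3.0) combines dimension 0 of z1 with dimension 1 of z2, a cross-meta-path, cross-dimension interaction that no choice of attention weights beta in the weighted sum could produce.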

Related Work
HIN embedding
Definition
Proposed Model
Node-level attention
Semantic-level neural network
Datasets
Baselines
Implementation details
Classification
Clustering
Parametric analysis
Findings
Conclusion