Abstract
As an essential concept in attention, context defines the overall scope under consideration. In attention-based GNNs, the context is the set of node representations used in graph embedding. Current approaches choose the immediate neighbors of the target, or a subset of them, as the context, which limits the ability of attention to capture long-distance dependencies. To address this deficiency, we propose a novel attention-based GNN framework with extended contexts. Concretely, multi-hop nodes are first selected for context expansion according to information transferability and the number of hops. Then, to reduce the computational cost and fit the graph representation learning process, two heuristic context refinement policies are designed that focus on local graph structure. The first, for graphs with high degrees, removes multi-hop neighbors with few connections to the target in order to acquire accurate diffused information. The second, for graphs with low degrees or uniform degree distributions, removes low-transferability neighbors to ensure that graph locality is not obscured by the global information induced by the extended context. Finally, multi-head attention is applied over the refined context. Numerical comparisons with 23 baselines demonstrate the superiority of our method. Extensive model analysis shows that properly extending the context with informative multi-hop neighbors indeed improves the performance of attention-based GNNs.
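The two stages described above, multi-hop context expansion followed by structure-aware refinement, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the function names, the `min_links` threshold for the high-degree policy, and the `1/hop` transferability proxy for the low-degree policy are all hypothetical stand-ins for the paper's actual criteria.

```python
from collections import deque

def multi_hop_context(adj, target, max_hops):
    """Expand the context via BFS: collect all nodes within max_hops
    of the target, mapping each node to its hop distance."""
    hops = {target: 0}
    queue = deque([target])
    while queue:
        u = queue.popleft()
        if hops[u] == max_hops:
            continue  # do not expand beyond the hop limit
        for v in adj[u]:
            if v not in hops:
                hops[v] = hops[u] + 1
                queue.append(v)
    hops.pop(target)  # the context excludes the target itself
    return hops

def refine_context(adj, target, hops, min_links=1, high_degree=True):
    """Refine the expanded context with one of two heuristic policies.
    High-degree policy: keep a multi-hop node only if it has at least
    min_links edges into the target's 1-hop neighborhood (a proxy for
    'connections to the target'). Low-degree policy: keep a node only
    if a toy transferability score, 1/hop, stays above a threshold."""
    one_hop = set(adj[target])
    kept = set()
    for v, h in hops.items():
        if h == 1:
            kept.add(v)  # immediate neighbors always remain in context
        elif high_degree:
            if len(one_hop & set(adj[v])) >= min_links:
                kept.add(v)
        else:
            if 1.0 / h >= 0.5:  # hypothetical decay-with-distance cutoff
                kept.add(v)
    return kept
```

On a small example graph, expansion gathers 1- and 2-hop neighbors, and the high-degree policy then prunes the 2-hop nodes that are weakly connected back to the target's neighborhood; attention would subsequently be computed over the surviving set.

```python
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
       3: [1, 4], 4: [2, 3, 5], 5: [4]}
hops = multi_hop_context(adj, 0, max_hops=2)   # {1: 1, 2: 1, 3: 2, 4: 2}
ctx = refine_context(adj, 0, hops, min_links=2)  # {1, 2}: 3 and 4 pruned
```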