Abstract

Contrastive learning is widely used in graph representation learning, where node or graph representations are extracted by contrasting positive and negative node pairs. It requires node representations (embeddings) to reflect topological correlations: in the embedding space, the similarity between an anchor node and its positive nodes is increased, while the similarity with its negative nodes is reduced. However, most existing contrastive models measure similarity with a fixed metric that scores all sample pairs equally in a single feature space, ignoring variation in node attributes and network topology. Moreover, these fixed metrics are typically defined explicitly and by hand, so they do not suit all graphs and networks. To address these problems, we propose a novel graph representation learning model with an adaptive metric, called GRAM. It produces appropriate similarity scores for node pairs by weighting each dimension of their embedding vectors according to its significance and by adapting the metric to the data distribution. With these scores, it trains a graph encoder more effectively and obtains more representative embeddings. Experimental results show that GRAM is strongly competitive on multiple tasks.
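The contrastive objective described above can be sketched as follows. This is a minimal illustration, not GRAM's actual method: the InfoNCE-style loss, the diagonal per-dimension weight vector `w` standing in for the adaptive metric, and all function names are assumptions, since the abstract does not specify the model's exact formulation.

```python
import numpy as np

def weighted_similarity(z1, z2, w):
    """Cosine similarity under a diagonal metric: each embedding dimension
    is scaled by a per-dimension weight before comparison. The weight
    vector `w` is a hypothetical stand-in for an adaptive metric."""
    a, b = z1 * w, z2 * w
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def contrastive_loss(anchor, positive, negatives, w, tau=0.5):
    """InfoNCE-style loss: pulls the anchor toward its positive node and
    pushes it away from its negative nodes in embedding space."""
    pos = np.exp(weighted_similarity(anchor, positive, w) / tau)
    neg = sum(np.exp(weighted_similarity(anchor, n, w) / tau) for n in negatives)
    return -np.log(pos / (pos + neg))
```

In a trainable setting, `w` (or a richer metric) would be learned jointly with the encoder, so that dimensions more informative for a given graph contribute more to the similarity score.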
