Graph Neural Networks (GNNs) have emerged as an effective and widely used method for learning from graph-structured data across many domains. Despite the abundance of GNN variants, many struggle to propagate messages effectively over long distances. This paper introduces a novel hierarchical message-passing framework for graph learning, designed specifically to address long-distance message propagation in graphs. We construct smaller graphs from the main graph using domination, a fundamental concept in graph theory, which enables more efficient message passing within each subgraph. We then employ a Graph Attention Network (GAT) to aggregate these subgraph features and propagate them to distant nodes across the graph. Experiments on standard node classification datasets show that the proposed architecture achieves performance comparable to or better than conventional GNNs, and it consistently outperforms them on graphs with missing edges.
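For readers unfamiliar with domination: a dominating set is a subset of nodes such that every node in the graph is either in the set or adjacent to a member of it. The sketch below shows a standard greedy approximation of a minimum dominating set on a plain adjacency-dict graph; it is an illustration of the concept only, and the `greedy_dominating_set` helper is an assumption, not the paper's actual construction.

```python
def greedy_dominating_set(adj):
    """Greedy approximation of a minimum dominating set.

    adj: dict mapping each node to the set of its neighbors.
    Returns a set of nodes such that every node is in the set
    or adjacent to a node in the set.
    """
    undominated = set(adj)
    dominators = set()
    while undominated:
        # Pick the node whose closed neighborhood covers the most
        # still-undominated nodes.
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        dominators.add(best)
        undominated -= {best} | adj[best]
    return dominators


if __name__ == "__main__":
    # Path graph 0-1-2-3-4: nodes 1 and 3 together dominate all five nodes.
    adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
    print(greedy_dominating_set(adj))  # → {1, 3}
```

In a hierarchical scheme like the one described, each dominator and its neighborhood would form a small subgraph for local message passing, with the dominators serving as the nodes of the coarser graph.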