Abstract

Most existing graph neural networks (GNNs) take the original node features and adjacency relationships as a single-channel input, ignoring the rich information carried by multiple graph channels. To address this issue, multichannel graph analysis frameworks have been developed to fuse graph information across channels. How to model and integrate shared (i.e., consistency) and channel-specific (i.e., complementarity) information is a key issue in multichannel graph analysis. In this article, we propose a cross-channel graph information bottleneck (CCGIB) principle that maximizes the agreement between common representations and the disagreement between channel-specific representations. Under this principle, we formulate consistency and complementarity information bottleneck (IB) objectives. To enable optimization, a viable approach is to derive variational lower bounds and variational upper bounds (VarUBs) of the mutual information terms and then optimize these bounds to obtain approximate solutions. However, the lower bounds of the cross-channel mutual information objectives are difficult to obtain by direct variational approximation because the distributions are independent. To address this challenge, we leverage an inherent property of joint distributions and derive variational bounds that effectively optimize these information objectives. Extensive experiments on graph benchmark datasets demonstrate the effectiveness of the proposed method.
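
To make the two objectives concrete, below is a minimal PyTorch sketch of a CCGIB-style loss. It is an illustration under assumptions, not the paper's exact formulation: the consistency term uses an InfoNCE-style variational lower bound on cross-channel mutual information (maximized, encouraging agreement of the common representations), and the complementarity term uses a CLUB-style variational upper bound (minimized, encouraging disagreement of the channel-specific representations). All function names, the `predictor` network, and the specific choice of bounds are hypothetical.

```python
# Hypothetical sketch of a CCGIB-style objective (bound choices are assumptions).
import torch
import torch.nn.functional as F


def infonce_lower_bound(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE-style lower bound on I(z1; z2); larger means more agreement."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                       # (N, N) cross-channel similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return -F.cross_entropy(logits, labels)          # negative NCE loss ~ MI lower bound


def club_upper_bound(s1: torch.Tensor, s2: torch.Tensor, predictor: torch.nn.Module) -> torch.Tensor:
    """CLUB-style upper bound on I(s1; s2), using a learned Gaussian q(s2 | s1)."""
    mu = predictor(s1)                               # predicted mean of q(s2 | s1)
    positive = -((mu - s2) ** 2).sum(-1).mean()      # log-density at matched pairs
    negative = -((mu.unsqueeze(1) - s2.unsqueeze(0)) ** 2).sum(-1).mean()  # all pairs
    return positive - negative                       # CLUB estimate of the upper bound


def ccgib_loss(c1, c2, s1, s2, predictor, beta: float = 1.0) -> torch.Tensor:
    """Consistency: maximize agreement of common reps c1, c2 across channels.
    Complementarity: minimize MI between channel-specific reps s1, s2.
    (The IB compression terms of the full objective are omitted here.)"""
    consistency = -infonce_lower_bound(c1, c2)            # maximize the lower bound
    complementarity = club_upper_bound(s1, s2, predictor)  # minimize the upper bound
    return consistency + beta * complementarity


if __name__ == "__main__":
    N, d = 8, 16
    c1, c2, s1, s2 = (torch.randn(N, d) for _ in range(4))
    predictor = torch.nn.Linear(d, d)                # parameterizes q(s2 | s1)
    print(ccgib_loss(c1, c2, s1, s2, predictor).item())
```

Optimizing a lower bound for the term to be maximized and an upper bound for the term to be minimized keeps both surrogate objectives on the correct side of the true mutual information, which is the general pattern the abstract describes.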
