Abstract

General graph neural networks (GNNs) implement convolution operations on graphs based on polynomial spectral filters. Existing filters with high-order polynomial approximations can detect more structural information when reaching high-order neighborhoods, but they produce indistinguishable node representations, which indicates their inefficiency in processing information from high-order neighborhoods and results in performance degradation. In this article, we theoretically show that this problem is avoidable and attribute it to overfitted polynomial coefficients. To cope with it, we restrict the coefficients in two steps: dimensionality reduction of the coefficients' domain and sequential assignment of a forgetting factor. We thereby transform the optimization of the coefficients into the tuning of a single hyperparameter and propose a flexible spectral-domain graph filter, which significantly reduces the memory demand and the adverse impact on message transmission under large receptive fields. With our filter, the performance of GNNs improves significantly under large receptive fields, and the achievable receptive fields of GNNs are multiplied as well. Meanwhile, the superiority of applying a high-order approximation is verified across various datasets, notably on strongly hyperbolic datasets. Code is publicly available at: https://github.com/cengzeyuan/TNNLS-FFKSF.
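The abstract describes the scheme only at a high level. As a rough illustration, the sketch below shows one common way such a coefficient restriction can be realized: a K-order polynomial filter whose coefficients are tied geometrically to a single forgetting factor gamma (theta_k = gamma^k), so that tuning gamma replaces learning K + 1 free coefficients. The function name `forgetting_factor_filter`, the parameterization theta_k = gamma^k, and all variable names are illustrative assumptions, not the paper's actual FFKSF implementation; consult the linked repository for the authors' code.

```python
import torch

def forgetting_factor_filter(x, adj_norm, K=10, gamma=0.9):
    """Sketch of a K-order polynomial spectral filter with its
    coefficients collapsed onto one forgetting factor:
        y = sum_{k=0}^{K} gamma**k * adj_norm**k @ x
    so only the scalar hyperparameter `gamma` needs tuning.

    x        : (N, F) node feature matrix
    adj_norm : (N, N) normalized adjacency / propagation matrix
    """
    out = x.clone()          # k = 0 term (gamma**0 = 1)
    h = x
    for k in range(1, K + 1):
        h = adj_norm @ h     # propagate to the k-th-order neighborhood
        out = out + (gamma ** k) * h  # down-weight distant neighborhoods
    return out

# Toy usage: 4-node path graph with symmetric normalization.
A = torch.tensor([[0., 1, 0, 0],
                  [1., 0, 1, 0],
                  [0., 1, 0, 1],
                  [0., 0, 1, 0]])
deg = A.sum(dim=1)
A_hat = A / torch.sqrt(deg[:, None] * deg[None, :])
feats = torch.randn(4, 3)
smoothed = forgetting_factor_filter(feats, A_hat, K=8, gamma=0.85)
```

Because gamma < 1 geometrically damps high-order terms, distant neighborhoods still contribute structural information without dominating the representation, which is one plausible reading of how restricting the coefficients mitigates indistinguishable node embeddings at large receptive fields.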
