The PageRank model is a powerful tool for network analysis, used across disciplines such as web information retrieval, bioinformatics, community detection, and graph neural networks. Owing to the ever-increasing scale of networks, computing this model requires solving a large-scale linear system or eigenvector problem. Conventional preconditioners and iterative methods for general linear systems or eigenvector problems often perform poorly on such problems, particularly as the damping factor approaches 1, necessitating specialized methods that exploit the specific properties of the PageRank coefficient matrix. Moreover, in practical applications the optimal hyperparameter settings are generally unknown in advance, and networks often evolve over time, so the problem must be recomputed after minor modifications. In this setting, highly efficient preconditioners that substantially accelerate the iterative solution at low memory cost are desirable. In this paper, we present two techniques that leverage the sparsity structure and numerical properties of the PageRank system, together with a preconditioner based on the computed matrix structure. Experiments demonstrate the strong performance of the proposed methods on realistic PageRank computations.
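To make the problem formulation concrete, the following is a minimal sketch of the PageRank linear system (I - αP)x = (1 - α)v solved by plain fixed-point iteration, assuming a column-stochastic link matrix P, damping factor α, and a uniform teleportation vector v. This illustrates only the baseline unpreconditioned iteration, not the preconditioners proposed in the paper; all names and parameters here are illustrative.

```python
import numpy as np

def pagerank(P, alpha=0.85, tol=1e-10, max_iter=1000):
    """Solve (I - alpha*P) x = (1 - alpha) v by fixed-point iteration.

    P is assumed column-stochastic; convergence slows as alpha -> 1,
    which is the regime the paper's preconditioners target.
    """
    n = P.shape[0]
    v = np.full(n, 1.0 / n)   # uniform teleportation vector
    x = v.copy()
    for _ in range(max_iter):
        x_new = alpha * (P @ x) + (1.0 - alpha) * v
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Tiny 3-node example: column j holds node j's out-link probabilities.
P = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
x = pagerank(P)
```

Because P is column-stochastic and α < 1, the iteration matrix has spectral radius α, so the iteration converges for any starting vector and the resulting PageRank vector sums to 1.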