Abstract
The PageRank model is a powerful tool for network analysis, utilized across various disciplines such as web information retrieval, bioinformatics, community detection, and graph neural networks. Owing to the ever-increasing scale of networks, computing this model requires solving a large-dimensional linear system or eigenvector problem. Conventional preconditioners and iterative methods for general linear systems or eigenvector problems often exhibit unsatisfactory performance on such problems, particularly as the damping factor approaches 1, necessitating specialized methods that exploit the specific properties of the PageRank coefficient matrix. Additionally, in practical applications, the optimal hyperparameter settings are generally unknown in advance, and networks often evolve over time, so the problem must be recomputed after minor modifications. In this scenario, highly efficient preconditioners that significantly accelerate the iterative solution at low memory cost are desirable. In this paper, we present two techniques that leverage the sparsity structures and numerical properties of the PageRank system, as well as a preconditioner based on the computed matrix structure. Experiments demonstrate the favorable performance of the proposed methods on realistic PageRank computations.
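For context, the PageRank problem mentioned above is commonly posed as the linear system (I − αP)x = (1 − α)v, where P is a column-stochastic transition matrix, α is the damping factor, and v is a teleportation vector. The following is a minimal illustrative sketch (not the authors' method) that solves this system with the classical power/Richardson iteration; the uniform teleportation vector, dangling-node handling, and dense matrix representation are assumptions for the example only.

```python
import numpy as np

def pagerank_power(A, alpha=0.85, tol=1e-10, max_iter=1000):
    """Solve (I - alpha*P) x = (1 - alpha) v by power (Richardson) iteration.

    A is the adjacency matrix with A[i, j] = 1 for an edge j -> i;
    P is the column-stochastic transition matrix derived from A.
    """
    n = A.shape[0]
    out_deg = A.sum(axis=0)                              # out-degree of each node (column sums)
    # Column-stochastic P; dangling nodes (zero out-degree) link uniformly to all nodes.
    P = np.where(out_deg > 0, A / np.where(out_deg == 0, 1, out_deg), 1.0 / n)
    v = np.full(n, 1.0 / n)                              # assumed uniform teleportation vector
    x = v.copy()
    for _ in range(max_iter):
        x_new = alpha * (P @ x) + (1 - alpha) * v        # one Richardson sweep
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Tiny example graph: edges 2->0, 0->1, 0->2, 1->2.
A = np.array([[0, 0, 1],
              [1, 0, 0],
              [1, 1, 0]], dtype=float)
print(pagerank_power(A, alpha=0.85))
```

As the abstract notes, this simple iteration converges increasingly slowly as α → 1, which is precisely the regime where specialized preconditioners become valuable.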