Abstract

The PageRank model, first proposed by Google for its web search engine, has since become a popular computational tool in a wide range of scientific fields, including chemistry, bioinformatics, neuroscience, bibliometrics, social networks, and others. PageRank calculations require fast computational techniques with low algorithmic and memory complexity. In recent years, much attention has been paid to Krylov subspace algorithms for solving difficult PageRank linear systems, such as those with large damping parameters close to one. In this article, we examine the full orthogonalization method (FOM). We present a convergence study of the method that extends and clarifies part of the conclusions reached in Zhang et al. (J Comput Appl Math. 2016;296:397-409). Furthermore, we demonstrate that FOM is breakdown-free when solving singular PageRank linear systems of index one, and we investigate the effect on FOM convergence of using weighted inner products instead of conventional inner products in the orthonormalization procedure. Finally, we develop a shifted polynomial preconditioner that exploits the special structure of the PageRank linear system and clusters most of the eigenvalues effectively, making it well suited to an iterative method such as FOM or GMRES. Numerical experiments are presented to support the theoretical findings and to evaluate the performance of the new weighted preconditioned FOM PageRank solver against other established solvers for this class of problem, including conventional stationary methods, hybrid combinations of stationary and Krylov subspace methods, and multi-step splitting strategies.
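
For readers less familiar with the formulation, the sketch below illustrates the kind of PageRank linear system the article addresses, (I - alpha*P) x = (1 - alpha) v with a column-stochastic matrix P and damping parameter alpha, solved by a basic restarted FOM (Arnoldi) iteration. This is a minimal NumPy illustration under those assumptions; the function name fom_pagerank, the restart scheme, and the dense matrix handling are hypothetical and are not the weighted, preconditioned solver developed in the article.

import numpy as np

def fom_pagerank(P, alpha=0.99, v=None, m=50, tol=1e-8, max_restarts=20):
    # Minimal restarted FOM sketch for the PageRank system
    # (I - alpha * P) x = (1 - alpha) v, with P column-stochastic.
    # Illustrative only; not the article's weighted/preconditioned solver.
    n = P.shape[0]
    v = np.ones(n) / n if v is None else v
    b = (1.0 - alpha) * v
    matvec = lambda x: x - alpha * (P @ x)   # apply A = I - alpha * P
    x = v.copy()                             # initial guess
    for _ in range(max_restarts):
        r = b - matvec(x)
        beta = np.linalg.norm(r)
        if beta <= tol:
            break
        V = np.zeros((n, m + 1))             # orthonormal Krylov basis
        H = np.zeros((m + 1, m))             # upper Hessenberg projection
        V[:, 0] = r / beta
        k = m
        for j in range(m):                   # Arnoldi with modified Gram-Schmidt
            w = matvec(V[:, j])
            for i in range(j + 1):
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-14:          # "lucky" breakdown: exact solve in K_j
                k = j + 1
                break
            V[:, j + 1] = w / H[j + 1, j]
        # FOM correction: solve the small projected system H_k y = beta * e_1
        e1 = np.zeros(k)
        e1[0] = beta
        y = np.linalg.solve(H[:k, :k], e1)
        x = x + V[:, :k] @ y
    return x / x.sum()                       # rescale to a probability vector

With a small column-stochastic test matrix, x = fom_pagerank(P, alpha=0.99) returns an approximate PageRank vector; in practice P is large and sparse, the matrix-vector product dominates the cost, and convergence degrades as alpha approaches one, which is the regime where the weighting and preconditioning studied in the article become relevant.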
