Abstract

Key advantages of conjugate gradient (CG) methods are that they require far less computer memory than full singular value decomposition (SVD), and that iteration may be stopped at any time to give an approximate solution; this means that they may be used to obtain solutions of problems that are too large for SVD. The disadvantage is that CG does not conveniently provide auxiliary information on the quality of the solution (resolution and covariance matrices). This may be overcome by extensions of Paige and Saunders’ LSQR algorithm, which is one of the family of CG algorithms. The extensions are produced by analogy with SVD: the bidiagonalization in LSQR produces orthonormal basis vectors that can be used to construct solutions and estimates of resolution and covariance. For large problems, for which SVD cannot be performed, the new method provides approximate resolution and covariance estimates that asymptotically approach those of the SVD solutions as the number of iterations increases.
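The construction can be sketched concretely. In LSQR, Golub-Kahan bidiagonalization of the system matrix A, started from the data vector b, yields orthonormal bases U_{k+1} and V_k and a lower-bidiagonal matrix B_k with A V_k ≈ U_{k+1} B_k. By analogy with the truncated SVD, V_k plays the role of the right singular vectors, and one natural pair of SVD-analogy estimates is R_k ≈ V_k V_k^T for resolution and C_k ≈ σ² V_k (B_k^T B_k)^{-1} V_k^T for covariance. The Python sketch below is illustrative only and not the paper's exact formulation: the test matrix, the noise level sigma, and the helper golub_kahan are assumptions made for the demonstration, and full reorthogonalization is used for clarity even though production LSQR stores neither U nor V.

```python
import numpy as np

def golub_kahan(A, b, k):
    """Golub-Kahan bidiagonalization, the core recurrence of LSQR.
    Illustrative sketch: full reorthogonalization is used for numerical
    clarity on a small dense demo (it subsumes the usual subtraction of
    the single previous basis vector); breakdown (a zero norm) is not
    handled, which is fine for generic random data.
    Returns U (m x k+1), lower-bidiagonal B (k+1 x k), V (n x k)
    with A @ V ~= U @ B."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))
    U[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A.T @ U[:, j]
        w -= V[:, :j] @ (V[:, :j].T @ w)          # reorthogonalize vs V
        B[j, j] = np.linalg.norm(w)               # alpha_j
        V[:, j] = w / B[j, j]
        w = A @ V[:, j]
        w -= U[:, :j + 1] @ (U[:, :j + 1].T @ w)  # reorthogonalize vs U
        B[j + 1, j] = np.linalg.norm(w)           # beta_{j+1}
        U[:, j + 1] = w / B[j + 1, j]
    return U, B, V

rng = np.random.default_rng(1)
m, n, sigma = 200, 50, 0.01                       # assumed demo sizes
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + sigma * rng.standard_normal(m)

# Full-SVD reference: resolution R = V_r V_r^T and covariance
# C = sigma^2 V_r S^{-2} V_r^T over the retained singular values.
_, s, Vt = np.linalg.svd(A, full_matrices=False)
R_svd = Vt.T @ Vt
C_svd = sigma**2 * (Vt.T / s**2) @ Vt

for k in (10, 25, 50):
    U, B, V = golub_kahan(A, b, k)
    # LSQR iterate from the small bidiagonal problem min ||B y - beta1 e1||,
    # then mapped back through the Krylov basis: x_k = V y.
    e1 = np.zeros(k + 1)
    e1[0] = np.linalg.norm(b)
    y, *_ = np.linalg.lstsq(B, e1, rcond=None)
    x_k = V @ y
    # SVD-analogy estimates built from the k orthonormal basis vectors.
    R_k = V @ V.T
    C_k = sigma**2 * V @ np.linalg.inv(B.T @ B) @ V.T
    print(f"k={k:2d}  |x_k-x_true|={np.linalg.norm(x_k - x_true):.2e}  "
          f"|R_k-R_svd|={np.linalg.norm(R_k - R_svd):.2e}  "
          f"|C_k-C_svd|={np.linalg.norm(C_k - C_svd):.2e}")
```

As k grows, the printed differences from the full-SVD reference shrink, illustrating the asymptotic agreement described above; the cost is storing the k basis vectors V_k, which for a sparse A remains far cheaper than computing a full SVD.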
