Abstract
Key advantages of conjugate gradient (CG) methods are that they require far less computer memory than full singular value decomposition (SVD) and that iteration may be stopped at any time to yield an approximate solution, so they can be applied to problems that are too large for SVD. The disadvantage is that CG does not conveniently provide auxiliary information on the quality of the solution (resolution and covariance matrices). This may be overcome by extensions of Paige and Saunders’ LSQR algorithm, a member of the CG family of algorithms. The extensions are produced by analogy with SVD: the bidiagonalization in LSQR produces orthonormal basis vectors that can be used to construct solutions and estimates of resolution and covariance. For large problems, for which SVD cannot be performed, the new method provides approximate resolution and covariance estimates that asymptotically approach those of the SVD solutions as the number of iterations increases.
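To make the SVD analogy concrete, the following is a minimal NumPy sketch, not code from the paper: it runs k steps of Golub–Kahan bidiagonalization (the factorization underlying LSQR) on a dense matrix, then uses the orthonormal basis V_k and the SVD of the small bidiagonal matrix B_k as stand-ins for the singular vectors and values of A when forming approximate resolution and covariance estimates. The function name and dense-matrix setup are illustrative assumptions; a production version would work matrix-free, reorthogonalize the Lanczos vectors, and guard against breakdown.

```python
import numpy as np

def lsqr_resolution_covariance(A, b, k, sigma2=1.0):
    """Hypothetical sketch: k steps of Golub-Kahan bidiagonalization,
    then approximate solution, resolution, and covariance by SVD analogy."""
    m, n = A.shape
    U = np.zeros((m, k + 1))   # left Lanczos vectors
    V = np.zeros((n, k))       # right Lanczos vectors (orthonormal basis)
    B = np.zeros((k + 1, k))   # lower-bidiagonal projection of A

    beta = np.linalg.norm(b)
    U[:, 0] = b / beta
    v = A.T @ U[:, 0]
    for j in range(k):
        alpha = np.linalg.norm(v)          # assumes no breakdown (alpha > 0)
        V[:, j] = v / alpha
        B[j, j] = alpha
        u = A @ V[:, j] - alpha * U[:, j]
        bnext = np.linalg.norm(u)
        U[:, j + 1] = u / bnext
        B[j + 1, j] = bnext
        v = A.T @ U[:, j + 1] - bnext * V[:, j]

    # Solve the small projected least-squares problem: min ||B y - beta*e1||.
    e1 = np.zeros(k + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(B, e1, rcond=None)
    x = V @ y                              # approximate LSQR solution after k steps

    # SVD of the small bidiagonal matrix stands in for the SVD of A.
    P, s, Qt = np.linalg.svd(B, full_matrices=False)
    Vt = V @ Qt.T                          # approximate right singular vectors of A
    R = Vt @ Vt.T                          # approximate resolution matrix
    C = sigma2 * (Vt / s**2) @ Vt.T        # approximate model covariance
    return x, R, C
```

As k grows, the columns of Vt approach the leading right singular vectors of A, so R and C approach the truncated-SVD resolution and covariance, which is the asymptotic behaviour described above.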