Abstract
We propose and analyze a new tool to help solve sparse linear least-squares problems $\min_{x}\|Ax-b\|_{2}$. Our method is based on a sparse $QR$ factorization of a low-rank perturbation $\hat{A}$ of $A$. More precisely, we show that the $R$ factor of $\hat{A}$ is an effective preconditioner for the least-squares problem $\min_{x}\|Ax-b\|_{2}$ when it is solved using LSQR. We propose several applications for the new technique. When $A$ is rank deficient, we can add rows to ensure that the preconditioner is well conditioned without column pivoting. When $A$ is sparse except for a few dense rows, we can drop these dense rows from $A$ to obtain $\hat{A}$. Another application is solving an updated or downdated problem: if $R$ is a good preconditioner for the original matrix $A$, it is also a good preconditioner for the updated or downdated matrix $\hat{A}$. We can also solve what-if scenarios, in which we want to find the solution if a column of the original matrix is changed or removed. We present a spectral theory that analyzes the generalized spectrum of the pencil $(A^{*}A,R^{*}R)$, and we use it to analyze the applications.
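The preconditioning idea described above can be sketched in a few lines of Python. This is an illustrative toy, not the paper's implementation: it builds a matrix $A$ that is sparse except for a few dense rows, forms $\hat{A}$ by dropping those rows, takes the $R$ factor of $\hat{A}$, and then runs LSQR on the transformed problem $\min_{y}\|A R^{-1} y - b\|_{2}$, recovering $x = R^{-1} y$. All dimensions, densities, and tolerances here are arbitrary choices for the demo.

```python
import numpy as np
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import lsqr, LinearOperator

# Toy setup (hypothetical sizes): A is sparse except for k dense rows.
rng = np.random.default_rng(0)
m, n, k = 300, 40, 3
A_sparse = rng.standard_normal((m - k, n)) * (rng.random((m - k, n)) < 0.1)
A_sparse[:n] += np.eye(n)           # ensure the sparse part has full column rank
A_dense = rng.standard_normal((k, n))
A = np.vstack([A_sparse, A_dense])
b = rng.standard_normal(m)

# Ahat drops the dense rows; its R factor is the preconditioner.
R = np.linalg.qr(A_sparse, mode='r')  # n x n upper triangular

# LSQR on the preconditioned operator y -> A R^{-1} y.
op = LinearOperator(
    (m, n),
    matvec=lambda y: A @ solve_triangular(R, y),
    rmatvec=lambda z: solve_triangular(R, A.T @ z, trans='T'),
)
y = lsqr(op, b, atol=1e-12, btol=1e-12)[0]
x = solve_triangular(R, y)          # recover the solution of the original problem

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Because $A^{*}A = R^{*}R + A_{\text{dense}}^{*}A_{\text{dense}}$, the generalized spectrum of $(A^{*}A, R^{*}R)$ clusters at $1$ with at most $k$ outlying eigenvalues, so LSQR converges in a handful of iterations on this example.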