LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares

ACM Transactions on Mathematical Software, Vol. 8, No. 1, March 1982 | Christopher C. Paige, Michael A. Saunders
The paper introduces LSQR, an iterative method for solving large, sparse linear systems and least squares problems. Based on the bidiagonalization procedure of Golub and Kahan, LSQR is analytically equivalent to the standard conjugate-gradient method but has better numerical properties. The method generates a sequence of approximations whose residual norms decrease monotonically. Reliable stopping criteria, along with estimates of standard errors for the solution and of the condition number of the matrix, are derived and incorporated into the FORTRAN implementation, subroutine LSQR. Numerical tests comparing LSQR with other conjugate-gradient algorithms indicate that LSQR is the most reliable when the matrix is ill-conditioned.

The paper also discusses the Lanczos process and its application to symmetric linear equations, as well as the bidiagonalization procedures Bidiag 1 and Bidiag 2. It compares LSQR with other methods such as CGLS, Craig's method, LSCG, LSLQ, and RRLS, highlighting their advantages and disadvantages in terms of numerical stability and performance.
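To make the underlying machinery concrete, the following is a minimal NumPy sketch of the Golub–Kahan bidiagonalization recurrence (Bidiag 1) on which LSQR is built: starting from β₁u₁ = b and α₁v₁ = Aᵀu₁, it alternates βⱼ₊₁uⱼ₊₁ = Avⱼ − αⱼuⱼ and αⱼ₊₁vⱼ₊₁ = Aᵀuⱼ₊₁ − βⱼ₊₁vⱼ, producing a lower-bidiagonal matrix B with A V = U B in exact arithmetic. This is an illustrative sketch, not the paper's FORTRAN subroutine; the function name and dense-matrix interface are our own choices, and no reorthogonalization or breakdown handling (αⱼ = 0 or βⱼ = 0) is included.

```python
import numpy as np

def golub_kahan_bidiag(A, b, k):
    """Run k steps of Golub-Kahan bidiagonalization (Bidiag 1).

    Returns U (m x (k+1)), B ((k+1) x k), V (n x k), where B is lower
    bidiagonal with the alphas on its diagonal and the betas below it,
    satisfying A @ V = U @ B in exact arithmetic.
    Illustrative sketch only: assumes no alpha or beta vanishes.
    """
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))

    # beta_1 u_1 = b
    beta = np.linalg.norm(b)
    U[:, 0] = b / beta
    for j in range(k):
        # alpha_j v_j = A^T u_j - beta_j v_{j-1}
        w = A.T @ U[:, j]
        if j > 0:
            w -= B[j, j - 1] * V[:, j - 1]
        alpha = np.linalg.norm(w)
        V[:, j] = w / alpha
        B[j, j] = alpha
        # beta_{j+1} u_{j+1} = A v_j - alpha_j u_j
        p = A @ V[:, j] - alpha * U[:, j]
        beta = np.linalg.norm(p)
        U[:, j + 1] = p / beta
        B[j + 1, j] = beta
    return U, B, V
```

LSQR then solves the least-squares problem restricted to the growing Krylov subspace spanned by the columns of V, using a QR factorization of B updated one plane rotation at a time. For practical use, `scipy.sparse.linalg.lsqr` implements the algorithm of this paper, including its `atol`/`btol` stopping criteria and condition-number estimate.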