Abstract

We extend the geometrical inverse approximation approach to the linear least-squares scenario. For that, we focus on the minimization of $1-\cos(X(A^{T}A),I)$, where $A$ is a full-rank matrix of size $m \times n$, with $m \ge n$, and $X$ is an approximation of the inverse of $A^{T}A$. In particular, we adapt the recently published simplified gradient-type iterative scheme MinCos to the least-squares problem. In addition, we combine the generated convergent sequence of matrices with well-known acceleration strategies based on recently developed matrix extrapolation methods, and also with some line search acceleration schemes based on selecting an appropriate steplength at each iteration. A set of numerical experiments, including large-scale problems, is presented to illustrate the performance of the different acceleration strategies.
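As a minimal sketch of the objective named in the abstract, the snippet below evaluates $1-\cos(X(A^{T}A),I)$, assuming the cosine between two matrices is the Frobenius-inner-product cosine (standard in the MinCos literature, though not defined in the abstract itself); the function names are illustrative, not taken from the paper.

```python
import numpy as np

def frobenius_cos(M, N):
    """Cosine of the angle between matrices M and N under the
    Frobenius inner product <M, N> = trace(M^T N)."""
    inner = np.trace(M.T @ N)
    return inner / (np.linalg.norm(M, "fro") * np.linalg.norm(N, "fro"))

def objective(X, A):
    """1 - cos(X(A^T A), I): vanishes exactly when X(A^T A) is a
    positive multiple of the identity, i.e. when X approximates
    (A^T A)^{-1} up to a positive scaling."""
    n = A.shape[1]
    return 1.0 - frobenius_cos(X @ (A.T @ A), np.eye(n))

# Illustration: for the exact inverse the objective is zero
# (up to rounding); for a generic matrix it is positive.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))      # full-rank m x n with m >= n
X_exact = np.linalg.inv(A.T @ A)
print(objective(X_exact, A))
print(objective(np.eye(4), A))
```

A gradient-type scheme such as MinCos would drive this quantity toward zero iteratively without forming $(A^{T}A)^{-1}$ explicitly.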
