Abstract

This paper introduces an algorithm for solving large-scale regularized least-squares problems subject to quadratic inequality constraints. The algorithm recasts the least-squares problem as a parameterized eigenvalue problem: only two of the smallest eigenpairs of the parameterized problem need to be computed, and the optimal solution is recovered from the corresponding parameterized eigenvector. Safeguards are introduced to adjust the parameter, and a two-point interpolation scheme is developed for updating it. A local convergence theory is presented, showing that the algorithm converges superlinearly.
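
To make the parameterization concrete, the following is a minimal numerical sketch (Python/NumPy) of the standard bordered-matrix reformulation that underlies eigenvalue-based methods for this problem class. The matrix B(alpha), the function name constrained_ls_sketch, and the crude bisection safeguard are illustrative assumptions, not the paper's algorithm: the paper computes two of the smallest eigenpairs and updates the parameter with a safeguarded two-point interpolation scheme, whereas this sketch uses only the smallest eigenpair, a dense eigensolver, and bisection.

```python
import numpy as np


def constrained_ls_sketch(A, b, delta, tol=1e-8, max_iter=60):
    """Illustrative sketch: min ||Ax - b||^2 subject to ||x|| <= delta,
    with the constraint assumed active.

    Assumed reformulation: for a scalar parameter alpha, take the smallest
    eigenpair of the bordered matrix
        B(alpha) = [[alpha, g^T],
                    [g,     H  ]],   H = A^T A,  g = -A^T b.
    Scaling the eigenvector so its first entry is 1, the remaining entries
    give a candidate x; alpha is adjusted until ||x|| = delta.
    """
    H = A.T @ A
    g = -(A.T @ b)
    n = H.shape[0]

    def step_from_smallest_eigpair(alpha):
        B = np.zeros((n + 1, n + 1))
        B[0, 0] = alpha
        B[0, 1:] = g
        B[1:, 0] = g
        B[1:, 1:] = H
        # Dense solver for clarity; a large-scale code would use an
        # iterative (Lanczos-type) eigensolver instead.
        w, V = np.linalg.eigh(B)
        v = V[:, 0]                 # eigenvector of the smallest eigenvalue
        if abs(v[0]) < 1e-12:       # near the "hard case": cannot normalize
            return None
        return v[1:] / v[0]         # scale so the leading entry is 1

    # Crude safeguarded bisection on alpha; a stand-in for the paper's
    # two-point interpolation update.
    bound = np.linalg.norm(H, 2) + np.linalg.norm(g) + 1.0
    lo, hi = -bound, bound
    x = np.zeros(n)
    for _ in range(max_iter):
        alpha = 0.5 * (lo + hi)
        cand = step_from_smallest_eigpair(alpha)
        if cand is None:
            hi = alpha              # back off when normalization fails
            continue
        x = cand
        r = np.linalg.norm(x) - delta
        if abs(r) < tol:
            break
        if r > 0:
            hi = alpha              # step too long: decrease alpha
        else:
            lo = alpha              # step too short: increase alpha
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 30))
    b = rng.standard_normal(200)
    x = constrained_ls_sketch(A, b, delta=0.2)
    print(np.linalg.norm(x))        # should be close to 0.2
```

In this sketch, bisection on alpha stands in for the safeguarded two-point interpolation update described in the abstract: away from the hard case, increasing alpha raises the smallest eigenvalue of B(alpha), which lengthens the recovered step, so the step norm depends monotonically on the parameter and a root-finding update on it is well defined.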
