Abstract

We propose a new algorithm for the optimization of convex functions over a polyhedral set in $\mathbb{R}^n$. The algorithm extends the spectral projected-gradient method with limited-memory BFGS iterates restricted to the current face whenever possible. We prove convergence of the algorithm under suitable conditions and apply it to solve the Lasso problem, and consequently the basis-pursuit denoise problem, through the root-finding framework proposed by van den Berg and Friedlander (SIAM J Sci Comput 31(2):890–912, 2008). The algorithm is especially well suited to simple domains and can also be used to solve bound-constrained problems as well as problems restricted to the simplex.
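
To make the projected-gradient building block concrete, here is a minimal sketch of a single spectral projected-gradient trial step with a Barzilai–Borwein step length. This is an illustration only, not the authors' implementation: the function name `spg_step`, the projection operator `proj`, and the safeguard bounds are all assumptions.

```python
import numpy as np

def spg_step(x, x_old, g, g_old, proj, alpha_min=1e-10, alpha_max=1e10):
    """One spectral projected-gradient trial step (illustrative sketch).

    x, x_old : current and previous iterates
    g, g_old : gradients of f at x and x_old
    proj     : Euclidean projection onto the polyhedral set C (assumed given)
    """
    s, y = x - x_old, g - g_old
    sty = s @ y
    # Barzilai-Borwein spectral step length, safeguarded to [alpha_min, alpha_max]
    alpha = (s @ s) / sty if sty > 0 else 1.0
    alpha = min(max(alpha, alpha_min), alpha_max)
    # Projected-gradient trial point; the full method would follow this with
    # a nonmonotone line search along d = x_trial - x.
    return proj(x - alpha * g)
```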

Highlights

  • In this paper we propose an algorithm for optimization problems of the form $\min_x f(x)$ subject to $x \in C$ (1), where $f : \mathbb{R}^n \to \mathbb{R}$ is a convex, twice continuously differentiable function, and $C$ is a polyhedral set in $\mathbb{R}^n$.

  • As this work was motivated by improving methods for the Lasso problem, we focus on the weighted one-norm ball $C_{w,1} = \{x \in \mathbb{R}^n \mid \|x\|_{w,1} \le \tau\}$, where $\|x\|_{w,1} := \sum_i w_i |x_i|$ with positive weights $w_i$ (a projection sketch follows this list).

  • The first thing to notice is that the runtime of qpOASES is insensitive to the value of τ, whereas the runtimes of spg and the hybrid method increase with τ.
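
Since the method repeatedly projects onto $C_{w,1}$, the following is a hedged sketch of the Euclidean projection onto the weighted one-norm ball via weighted soft thresholding. The function name and implementation details are our own and are not taken from the paper.

```python
import numpy as np

def proj_weighted_l1_ball(c, w, tau):
    """Euclidean projection of c onto {x : sum_i w_i*|x_i| <= tau}.

    Illustrative sketch; assumes w_i > 0 and tau > 0. Finds the multiplier
    lam >= 0 with sum_i w_i * max(|c_i| - lam*w_i, 0) = tau, then applies
    weighted soft thresholding.
    """
    a = np.abs(c)
    if w @ a <= tau:
        return c.copy()                       # already inside the ball
    ratio = a / w                             # breakpoints of the multiplier
    order = np.argsort(-ratio)                # sort breakpoints decreasingly
    cum_wa = np.cumsum((w * a)[order])
    cum_ww = np.cumsum((w * w)[order])
    lam_k = (cum_wa - tau) / cum_ww           # lam if first k entries active
    k = np.nonzero(ratio[order] > lam_k)[0].max() + 1
    lam = lam_k[k - 1]
    return np.sign(c) * np.maximum(a - lam * w, 0.0)
```

With `w = np.ones(n)` this reduces to the classical one-norm-ball projection; the sort makes the cost $O(n \log n)$.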

Summary

Introduction

In this paper we propose an algorithm for optimization problems of the form $\min_x f(x)$ subject to $x \in C$. A prominent special case is the Lasso problem, $\min_x \|Ax - b\|_2$ subject to $\|x\|_1 \le \tau$; solving the basis-pursuit denoise problem (BP$_\sigma$) can be reduced to finding the smallest $\tau$ for which the Lasso solution $x_\tau^*$ satisfies $\|Ax_\tau^* - b\|_2 \le \sigma$. Denoting by $\tau_\sigma$ this critical value of $\tau$, and assuming that $b$ lies in the range space of $A$, it was shown in [2] that the Pareto curve is convex and differentiable at all $\tau \in [0, \tau_0)$ with derivative $-\|A^T r\|_\infty / \|r\|_2$, where $r$ denotes the misfit $Ax_\tau^* - b$. In this paper we propose a hybrid algorithm that switches between the two methods (spectral projected gradient and limited-memory BFGS) in a seamless and lightweight fashion.
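
The root-finding framework of [2] can be sketched as a Newton iteration on the Pareto curve $\phi(\tau) = \|Ax_\tau^* - b\|_2$, using the derivative formula above. The sketch below assumes a black-box Lasso solver `solve_lasso` (which could be the hybrid method) and uses illustrative tolerances of our own choosing.

```python
import numpy as np

def bpdn_root_finding(A, b, sigma, solve_lasso, tol=1e-6, max_iter=20):
    """Newton root finding on the Pareto curve phi(tau) = ||A x_tau - b||_2.

    Sketch of the framework of van den Berg and Friedlander (2008);
    solve_lasso(A, b, tau) is an assumed black-box Lasso solver, and the
    stopping tolerance and iteration cap are illustrative.
    """
    tau = 0.0
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        x = solve_lasso(A, b, tau)            # Lasso solution x_tau
        r = A @ x - b                         # misfit
        phi = np.linalg.norm(r)
        if abs(phi - sigma) <= tol * max(1.0, sigma):
            break                             # phi(tau) ~= sigma: done
        dphi = -np.linalg.norm(A.T @ r, np.inf) / phi   # phi'(tau)
        tau += (sigma - phi) / dphi           # Newton step on phi(tau) = sigma
    return x, tau
```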

Paper outline
Notation and definitions
The nonmonotone spectral projected-gradient method
Limited-memory BFGS
Convergence results
Proposed algorithm
Quasi-Newton over a face
Convergence
Application to Lasso
Line search
Wolfe line search conditions
Projection arc
Facial structure
Self-projection cone of a face
Orthogonal basis for a face
Maximum step length along a face
Stopping criteria
Numerical experiments
Active-set type method
Comparison with pnopt
Lasso on sparse problems
Root finding
Coherent test problem generation
Highly coherent measurement matrices
Sparco test problems
Primal-dual gap
Findings
Conclusions