Abstract

We propose a subspace-accelerated Bregman method for the linearly constrained minimization of functions of the form $f(\mathbf u) + \tau_1\,\|\mathbf u\|_1 + \tau_2\,\|D\,\mathbf u\|_1$, where $f$ is a smooth convex function and $D$ represents a linear operator, e.g., a finite difference operator, as in anisotropic total variation and fused lasso regularizations. Problems of this type arise in a wide variety of applications, including portfolio optimization, learning of predictive models from functional magnetic resonance imaging (fMRI) data, and source detection problems in electroencephalography. The use of $\|D\,\mathbf u\|_1$ is aimed at encouraging structured sparsity in the solution. The subspaces where the acceleration is performed are selected so that the restriction of the objective function is a smooth function in a neighborhood of the current iterate. Numerical experiments for multi-period portfolio selection problems using real data sets show the effectiveness of the proposed method.
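To make the composite objective concrete, the sketch below evaluates $f(\mathbf u) + \tau_1\,\|\mathbf u\|_1 + \tau_2\,\|D\,\mathbf u\|_1$ with $D$ taken as a one-dimensional forward-difference operator, i.e., the fused-lasso penalty mentioned in the abstract. This is only an illustration of the problem being minimized, not the authors' method; the quadratic data term $f$, the dimensions, and the values of `tau1` and `tau2` are placeholder assumptions.

```python
import numpy as np

def objective(u, f, tau1, tau2):
    """Composite objective f(u) + tau1*||u||_1 + tau2*||D u||_1,
    where D is the 1-D forward-difference operator (fused-lasso /
    anisotropic-TV penalty) and f is any smooth convex data term."""
    Du = np.diff(u)  # D u: differences of adjacent entries
    return f(u) + tau1 * np.abs(u).sum() + tau2 * np.abs(Du).sum()

# Illustrative data term f(u) = 0.5*||A u - b||^2 (not from the paper).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
f = lambda u: 0.5 * np.sum((A @ u - b) ** 2)

u = rng.standard_normal(50)
print(objective(u, f, tau1=0.1, tau2=0.5))
```

In a multi-period portfolio setting, $\mathbf u$ would stack the portfolio weights over time, so the $\|D\,\mathbf u\|_1$ term penalizes changes between consecutive rebalancing periods, encouraging piecewise-constant (structured sparse) allocations.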
