Abstract

In this paper, a nonsmooth optimization method for locally Lipschitz functions on real algebraic varieties is developed. To this end, the set-valued $\varepsilon$-conditional subdifferential map $x \mapsto \partial_{\varepsilon}^{N} f(x):= \partial_{\varepsilon}f(x)+N(x)$ is introduced, where $\partial_{\varepsilon}f(x)$ is the Goldstein $\varepsilon$-subdifferential and $N(x)$ is a closed convex cone at $x$. It is proved that the negative of the shortest $\varepsilon$-conditional subgradient provides a descent direction in $T(x)$, the polar cone of $N(x)$. The $\varepsilon$-conditional subdifferential at an iterate $x_{\ell}$ can be approximated by the convex hull of finitely many gradients at sampling points in $x_\ell+\varepsilon_{\ell} B_{T(x_{\ell})}(0,1)$, projected onto $T(x_{\ell})$, where $T(x_{\ell})$ is a linear space in the Bouligand tangent cone and $B_{T(x_{\ell})}(0,1)$ denotes the unit ball in $T(x_{\ell})$. The negative of the shortest vector in this convex hull is shown to be a descent direction in the Bouligand tangent cone at $x_{\ell}$. The proposed algorithm takes a step along this descent direction with a certain step-size rule, followed by a retraction back onto the algebraic variety $\mathcal{M}$. Convergence of the resulting algorithm to a critical point is proved. For numerical illustration, the method is applied to nonsmooth problems on the variety $\mathcal{M}_{\leq r}$ of real $M\times N$ matrices of rank at most $r$, specifically robust low-rank matrix approximation and recovery in the presence of outliers.
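The core computation described above — sampling gradients in a tangent ball around $x_\ell$, projecting them onto $T(x_\ell)$, and taking the negative of the shortest vector in their convex hull — can be sketched as follows. This is a minimal illustration, not the paper's implementation: gradients are estimated by forward differences (the paper would use exact gradients), the min-norm point in the convex hull is found by a plain Frank-Wolfe iteration, and all function and parameter names are this sketch's own.

```python
import numpy as np

def min_norm_in_hull(G, iters=500):
    """Frank-Wolfe iteration for the shortest vector in conv{columns of G}."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)           # start at the barycentre of the simplex
    for k in range(iters):
        v = G @ lam                     # current point in the hull
        grad = 2.0 * (G.T @ v)          # gradient of lam -> ||G lam||^2
        j = np.argmin(grad)             # linear minimization over simplex vertices
        step = 2.0 / (k + 2.0)          # standard Frank-Wolfe step size
        e = np.zeros(m)
        e[j] = 1.0
        lam = (1.0 - step) * lam + step * e
    return G @ lam

def descent_direction(f, x, T, eps, n_samples=20, h=1e-6, rng=None):
    """Negative of the (approximate) shortest eps-conditional subgradient.

    T: matrix whose orthonormal columns span the linear space T(x)
    inside the tangent cone.  Gradients at points sampled uniformly in
    x + eps * B_{T(x)}(0,1) are estimated by forward differences and
    projected onto T(x)."""
    rng = np.random.default_rng(rng)
    d = T.shape[1]
    grads = []
    for _ in range(n_samples):
        u = rng.normal(size=d)
        u *= eps * rng.random() ** (1.0 / d) / np.linalg.norm(u)
        y = x + T @ u                   # sample point in the tangent ball
        # directional finite differences along the tangent basis
        g = np.array([(f(y + h * T[:, i]) - f(y)) / h for i in range(d)])
        grads.append(T @ g)             # projected gradient in ambient coordinates
    G = np.column_stack(grads)
    return -min_norm_in_hull(G)         # negative shortest vector in the hull
```

For instance, on $f(x) = |x_1| + 2|x_2|$ in $\mathbb{R}^2$ (with $T$ the identity, i.e. the whole space) at $x = (1, 1)$ with a small $\varepsilon$, all sampled gradients equal $(1, 2)$, so the computed direction points along $-(1, 2)$ and decreases $f$.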
