Abstract

We propose a block successive convex approximation algorithm for large-scale nonsmooth nonconvex optimization problems. It is suitable for problems whose dimension exceeds the memory and/or processing capability of existing hardware. The proposed algorithm partitions the whole set of variables into blocks, which are updated sequentially. At each iteration, a particular block variable is selected and updated by solving an approximation subproblem with respect to that block variable only. The proposed algorithm has several attractive features, namely, i) high flexibility, as the approximation function only needs to be strictly convex and need not be a global upper bound of the original function; ii) fast convergence, as the approximation function can be designed to exploit the problem structure at hand and the stepsize is computed by line search; iii) low complexity, as the line search is carried out over a properly constructed differentiable function; iv) guaranteed convergence to a stationary point, even when the objective function does not have a Lipschitz continuous gradient. These features are illustrated by an application in network anomaly detection.
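To make the iteration concrete, the following is a minimal Python sketch of a block successive convex approximation loop, shown on the convex special case 0.5||Ax - b||^2 + λ||x||_1 for compactness (the paper itself targets nonsmooth nonconvex objectives). The cyclic block selection, the proximal parameter `rho`, and all function names are illustrative assumptions, not the paper's implementation; the sketch only illustrates the three ingredients named above: a strictly convex per-block approximation, its closed-form solution, and an exact line search over a differentiable upper bound of the objective.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bsca_sketch(A, b, lam, n_blocks=4, n_iters=100, rho=1.0):
    """Hypothetical block-SCA sketch for 0.5*||Ax - b||^2 + lam*||x||_1.

    Per iteration: pick one block (cyclically here, for simplicity),
    build a strictly convex quadratic approximation of the smooth term
    in that block (linearization plus a rho-strongly-convex proximal
    term), solve it in closed form by soft-thresholding, then choose
    the stepsize by an exact line search over a differentiable upper
    bound of the objective.
    """
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    r = A @ x - b                        # residual, kept up to date
    for t in range(n_iters):
        k = blocks[t % n_blocks]         # cyclic block selection
        grad_k = A[:, k].T @ r           # block gradient of the smooth term
        # Strictly convex approximation: linearize f in block k and add
        # (rho/2)*||.||^2; the minimizer is a soft-thresholded gradient step.
        Bx_k = soft_threshold(x[k] - grad_k / rho, lam / rho)
        d_k = Bx_k - x[k]                # update direction for block k
        Ad = A[:, k] @ d_k
        # Differentiable line-search function (an upper bound on the
        # objective along the direction, with the l1 term linearized):
        #   h(gamma) = f(x + gamma*d) + gamma*(||Bx_k||_1 - ||x_k||_1)*lam,
        # which is quadratic in gamma here, so its minimizer is closed-form.
        num = -(r @ Ad) - lam * (np.abs(Bx_k).sum() - np.abs(x[k]).sum())
        den = Ad @ Ad
        gamma = np.clip(num / den, 0.0, 1.0) if den > 0 else 0.0
        x[k] += gamma * d_k
        r += gamma * Ad                  # update residual incrementally
    return x
```

Because the line search is performed over a differentiable (here, quadratic) function rather than the nonsmooth objective itself, the stepsize has a closed form in this example, which is the low-complexity feature the abstract refers to.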
