Abstract

This paper presents an algorithm, named the adaptive projected subgradient method, that asymptotically minimizes a certain sequence of nonnegative convex functions over a closed convex set in a real Hilbert space. The proposed algorithm is a natural extension of Polyak's subgradient algorithm, for nonsmooth convex optimization problems with a fixed target value, to the case where the convex objective itself keeps changing throughout the process. The main theorem, showing the strong convergence of the algorithm as well as the asymptotic optimality of the sequence it generates, can serve as a unified guiding principle for a wide range of set-theoretic adaptive filtering schemes for nonstationary random processes. These include not only existing adaptive filtering techniques, e.g., NLMS, the projected NLMS, the constrained NLMS, the affine projection algorithm (APA), and the adaptive parallel outer projection algorithm, but also new techniques, e.g., the adaptive parallel min-max projection algorithm, as well as their embedded-constraint versions. Numerical examples show that the proposed techniques are well suited to robust adaptive signal processing problems.
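For concreteness, the following is a sketch of the update rule this family of methods typically takes in the literature; the notation below (the time-varying objectives \Theta_n, the constraint set K, the metric projection P_K, and the relaxation parameters \lambda_n) is assumed here for illustration, not quoted from the paper. Given continuous convex functions \Theta_n : \mathcal{H} \to [0, \infty) on a real Hilbert space \mathcal{H}, a nonempty closed convex set K \subseteq \mathcal{H}, step sizes \lambda_n \in [0, 2], and a subgradient \Theta_n'(h_n) \in \partial\Theta_n(h_n), the iterate h_n is updated by

\[
  h_{n+1} :=
  \begin{cases}
    P_K\!\left( h_n - \lambda_n \, \dfrac{\Theta_n(h_n)}{\|\Theta_n'(h_n)\|^2} \, \Theta_n'(h_n) \right), & \text{if } \Theta_n'(h_n) \neq 0, \\[2ex]
    P_K(h_n), & \text{if } \Theta_n'(h_n) = 0.
  \end{cases}
\]

Freezing the objective, \Theta_n \equiv \Theta, recovers Polyak's subgradient step with target value 0; choosing \Theta_n(h) as the distance from h to the hyperplane determined by the nth data pair yields a relaxed projection onto that hyperplane, i.e., an NLMS-type update, which suggests the sense in which the method unifies the schemes listed above.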
