Abstract
Minimax approximations have found many applications but lack efficient solution algorithms for large-scale problems. Building on the alternating direction method of multipliers (ADMM) for convex optimization, this letter presents an efficient scalarwise algorithm for a regularized minimax approximation problem. The ADMM-based algorithm is then applied to the minimax design of two-dimensional (2-D) digital filters and to the training of randomized neural networks for regression on a real-world benchmark dataset. Experimental results demonstrate the fast convergence and low computational complexity of the proposed algorithm, as well as the good approximation/prediction performance of the learned approximation model.
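The letter's scalarwise, regularized algorithm is not reproduced in the abstract, but the general idea of solving a minimax (Chebyshev) approximation problem with ADMM can be illustrated. The sketch below is an assumption-laden toy version, not the authors' method: it solves the unregularized problem min_x ||Ax - b||_inf by splitting z = Ax - b, so the x-update is a least-squares solve and the z-update is the proximal operator of the infinity norm (computed via Moreau decomposition with a projection onto the l1 ball).

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto the l1 ball of the given radius."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(u) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1.0)  # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, t):
    """Prox of t*||.||_inf via Moreau decomposition with the l1 ball."""
    return v - t * project_l1_ball(v / t, 1.0)

def admm_minimax(A, b, rho=1.0, iters=2000):
    """Toy ADMM for min_x ||A x - b||_inf with the splitting z = A x - b.

    (Illustrative only; the letter's algorithm is scalarwise and
    handles a regularized formulation.)
    """
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)
    AtA = A.T @ A  # reused by every x-update (a least-squares solve)
    for _ in range(iters):
        x = np.linalg.solve(AtA, A.T @ (b + z - u))
        w = A @ x - b + u
        z = prox_linf(w, 1.0 / rho)   # z-update: prox of (1/rho)*||.||_inf
        u = w - z                     # scaled dual update
    return x
```

As a sanity check, the infinity-norm residual of the ADMM iterate should not exceed that of the ordinary least-squares fit once the iteration has converged, since the least-squares solution is feasible but generally suboptimal for the minimax criterion.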