Abstract

Convergence properties are studied for a class of gradient-based adaptive filters known as order statistic least mean square (OSLMS) algorithms. These algorithms apply an order statistic (OS) filtering operation to the gradient estimate of the standard least mean square (LMS) algorithm. The OS operation in OSLMS algorithms can reduce the variance of the gradient estimate (relative to LMS) when operating in non-Gaussian noise environments; a consequence is that the steady-state excess mean square error can be reduced. It is shown that when the input signals are i.i.d. and symmetrically distributed, the coefficient estimates of the OSLMS algorithms converge in mean to a small neighborhood of their optimal values. Simulations provide supporting evidence for algorithm convergence. As a measure of performance, the mean squared coefficient error of OSLMS algorithms has been evaluated under a range of noise distributions and OS operators. Guidelines for selecting the OS operator are presented based on the expected noise environment.
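The core idea described above can be illustrated with a minimal sketch. The standard LMS update is w[n+1] = w[n] + mu * e[n] * x[n]; an OSLMS variant instead applies an OS operator to a short window of recent instantaneous gradient estimates, coefficient by coefficient. The sketch below uses the median as the OS operator and a hypothetical `window` parameter for the OS window length; these choices, the function name, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def oslms_median(x, d, num_taps, mu, window=5):
    """Illustrative median-OSLMS sketch (assumptions, not the paper's
    exact algorithm): the instantaneous LMS gradient e[n]*x[n] is
    replaced, per coefficient, by the median of the last `window`
    gradient samples before the coefficient update."""
    n_samples = len(x)
    w = np.zeros(num_taps)
    grads = np.zeros((window, num_taps))   # ring buffer of gradient estimates
    y = np.zeros(n_samples)
    e = np.zeros(n_samples)
    for n in range(num_taps - 1, n_samples):
        u = x[n - num_taps + 1:n + 1][::-1]   # current input vector
        y[n] = w @ u                          # filter output
        e[n] = d[n] - y[n]                    # estimation error
        grads[n % window] = e[n] * u          # instantaneous gradient estimate
        w = w + mu * np.median(grads, axis=0) # OS (median) smoothed update
    return w, y, e
```

Replacing `np.median` with the mean recovers a block-averaged LMS; other OS operators (e.g., trimmed means) interpolate between the two, which is the design freedom the abstract's selection guidelines address.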
