Abstract

Sparse impulse responses are encountered in many applications (network and acoustic echo cancellation, feedback cancellation in hearing aids, etc.). Recently, a class of exponentiated gradient (EG) algorithms has been proposed. One of the algorithms belonging to this class, the so-called EG± algorithm, converges and tracks much better than the classical stochastic gradient, or LMS, algorithm for sparse impulse responses. We analyze the EG± and EG with unnormalized positive and negative weights (EGU±) algorithms and show when to expect them to behave like the LMS algorithm. We propose different versions normalized with respect to the input signal. It is shown that the proportionate normalized LMS (PNLMS) algorithm proposed by Duttweiler in the context of network echo cancellation (where the system impulse response is often sparse) is an approximation of the EG±, so that we can expect the two algorithms to have similar behavior. Finally, we demonstrate how the concept of exponentiated gradient could be used for blind multichannel identification and propose the multichannel EG± algorithm.

Keywords: Impulse Response, Adaptive Algorithm, Channel Impulse Response, Natural Gradient, Initial Convergence
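As a rough illustration of the contrast the abstract draws, the sketch below compares the classical LMS update with the EG± update (positive and negative weight vectors, multiplicative exponential updates with an L1 budget) on a noiseless sparse system identification task. This is not the paper's implementation; the filter length, step sizes, and the L1 budget `U` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 32                                    # adaptive filter length (assumed)
N = 5000                                  # number of input samples (assumed)
h = np.zeros(L)                           # sparse true impulse response:
h[3], h[10] = 1.0, -0.5                   # only 2 of 32 taps are active

x = rng.standard_normal(N)                # white-noise excitation
d = np.convolve(x, h)[:N]                 # noiseless desired signal d(n) = h^T x(n)

def lms(mu=0.01):
    """Classical LMS: additive update w <- w + mu * e * x."""
    w = np.zeros(L)
    for n in range(L, N):
        xn = x[n - L + 1:n + 1][::-1]     # tap-input vector, newest sample first
        e = d[n] - w @ xn
        w = w + mu * e * xn
    return w

def eg_pm(eta=0.01, U=4.0):
    """EG+-: estimate w = w_plus - w_minus; both kept positive,
    updated multiplicatively and renormalized to total L1 mass U
    (U must exceed the L1 norm of the target response)."""
    wp = np.full(L, U / (2 * L))
    wm = np.full(L, U / (2 * L))
    for n in range(L, N):
        xn = x[n - L + 1:n + 1][::-1]
        e = d[n] - (wp - wm) @ xn
        rp = wp * np.exp(eta * e * xn)    # exponentiated-gradient factors
        rm = wm * np.exp(-eta * e * xn)
        s = (rp + rm).sum()
        wp, wm = U * rp / s, U * rm / s   # renormalize to the L1 budget
    return wp - wm

w_lms = lms()
w_eg = eg_pm()
```

On this noiseless example both estimates approach the sparse target; the point of the paper is that EG± reaches a small misalignment in far fewer iterations than LMS when the response is sparse.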
