Abstract

In this paper, we present a new block LMS algorithm with a variable block length, which provides improved convergence speed while maintaining the misadjustment of the conventional LMS algorithm. The block length is increased or decreased according to whether the averaged squared error over the current block increases or decreases. We also derive the optimum initial step size for a white Gaussian input signal, and this optimum step size is used to improve the performance of the proposed algorithm. Simulation results comparing the proposed method with the block LMS algorithm and the normalized LMS algorithm show better performance in both convergence rate and misadjustment.

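To make the mechanism concrete, the following is a minimal NumPy sketch of a block LMS filter whose block length is adapted from the averaged squared error of each block. The doubling/halving rule, the bounds L_min and L_max, the function name, and the parameter values are illustrative assumptions; the paper's exact adaptation rule and its derivation of the optimum initial step size are not reproduced here.

import numpy as np

def variable_block_lms(x, d, num_taps, mu0, L0=4, L_min=1, L_max=64):
    """Block LMS adaptive filter with a variable block length (sketch).

    The weights are updated once per block; the block length is raised or
    lowered depending on whether the averaged squared error of the block
    just processed increased or decreased relative to the previous block.
    The doubling/halving rule and the bounds are illustrative assumptions.
    """
    w = np.zeros(num_taps)            # adaptive filter weights
    y_out = np.zeros(len(d))          # filter output for inspection
    L = L0                            # current block length
    prev_avg_err = np.inf             # averaged squared error of previous block

    n = num_taps - 1                  # first index with a full tap-delay line
    while n + L <= len(x):
        grad = np.zeros(num_taps)     # accumulated gradient estimate over the block
        block_err = 0.0
        for i in range(L):
            m = n + i
            u = x[m - num_taps + 1 : m + 1][::-1]   # tap-delay line [x[m], ..., x[m-N+1]]
            y = w @ u                 # filter output
            e = d[m] - y              # estimation error
            y_out[m] = y
            grad += e * u
            block_err += e * e

        w += (mu0 / L) * grad         # one weight update per block (block LMS)
        n += L                        # advance past the block just processed

        # adjust the block length in the same direction as the change in
        # averaged squared error (assumed doubling/halving rule)
        avg_err = block_err / L
        if avg_err > prev_avg_err:
            L = min(2 * L, L_max)
        elif avg_err < prev_avg_err:
            L = max(L // 2, L_min)
        prev_avg_err = avg_err

    return w, y_out


# Example: identify a short FIR system driven by white Gaussian noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])                      # unknown system
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, _ = variable_block_lms(x, d, num_taps=4, mu0=0.05)
print(np.round(w_hat, 3))                                 # should approach h

Performing one weight update per block keeps the per-sample cost low, while adapting the block length trades update frequency against gradient averaging as the error evolves, which is the intuition behind the convergence/misadjustment trade-off described above.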