Abstract

In narrow-band adaptive-array applications, the mean-square convergence of the discrete-time real least mean-square (LMS) algorithm is slowed by image-frequency noise generated in the LMS loops. The complex LMS algorithm proposed by Widrow et al. is shown to eliminate this noise, yielding convergence of the mean-squared error (MSE) at slightly more than twice the rate of the real algorithm. This paper includes a comprehensive analysis of the MSE of adaptation for LMS. The analysis is based upon the method developed in the 1968 dissertation by K. D. Senne, and it represents the most complete treatment of the subject published to date.
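The complex LMS update referred to above can be illustrated with a minimal sketch. This is not the paper's analysis, only the standard complex LMS recursion w ← w + μ e x* applied to a single complex weight tracking a narrow-band (tone-like) input; the signal model, step size μ, and all parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
mu = 0.05  # step size (hypothetical value, chosen for stable convergence)

# Narrow-band complex input: a complex exponential plus a little noise.
t = np.arange(n)
x = np.exp(1j * 0.3 * t) + 0.05 * (rng.standard_normal(n)
                                   + 1j * rng.standard_normal(n))

# Desired signal: the input passed through an unknown complex gain.
w_true = 0.8 - 0.6j
d = w_true * x

w = 0.0 + 0.0j          # single adaptive complex weight
err = np.empty(n)
for k in range(n):
    y = w * x[k]                      # filter output
    e = d[k] - y                      # complex error
    w = w + mu * e * np.conj(x[k])    # complex LMS update (Widrow et al.)
    err[k] = abs(e) ** 2              # instantaneous squared error

print(abs(w - w_true))  # distance to the true gain shrinks as the loop runs
```

Because both the weight and the signal are complex, a single update handles the in-phase and quadrature components jointly, which is the mechanism by which the complex algorithm avoids the image-frequency terms that arise when two real LMS loops are used instead.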
