Abstract

Millimetre wave (mm-wave) is an attractive option for high-data-rate applications in 5G wireless communication systems, which require proper beamforming and channel tracking. In this paper, we study, analyse, and compare the performance of two closely related stochastic gradient descent-based approaches, namely the least mean square (LMS) algorithm and the normalized least mean square (NLMS) algorithm, for tracking the transmit array beam as well as the channel state. These adaptive filters typically involve a trade-off between convergence speed and accuracy. We found that the quality of the tracking results, measured in the mean squared error (MSE) sense, depends heavily on the chosen step size of the gradient descent.
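The trade-off described above can be illustrated with a minimal sketch of the two update rules. This is not the paper's implementation; the channel taps, filter length, and step sizes below are illustrative assumptions. LMS uses a fixed step size `mu`, while NLMS divides the step by the instantaneous input power, making the effective step adapt to the signal level:

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    """Least mean square: w <- w + mu * e * x (fixed step size)."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps - 1, len(d)):
        xn = x[n - n_taps + 1 : n + 1][::-1]  # most recent samples first
        e[n] = d[n] - w @ xn                   # instantaneous error
        w += mu * e[n] * xn                    # fixed-step gradient update
    return w, e

def nlms(x, d, n_taps=4, mu=0.5, eps=1e-6):
    """Normalized LMS: the step is scaled by the inverse input power,
    so convergence is less sensitive to the input signal level."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps - 1, len(d)):
        xn = x[n - n_taps + 1 : n + 1][::-1]
        e[n] = d[n] - w @ xn
        w += (mu / (eps + xn @ xn)) * e[n] * xn  # power-normalized step
    return w, e

# Toy system-identification setup: a hypothetical 4-tap channel h,
# white input, and a small amount of observation noise.
rng = np.random.default_rng(0)
h = np.array([0.8, -0.4, 0.2, 0.1])
x = rng.standard_normal(2000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))

for name, (w, e) in (("LMS", lms(x, d)), ("NLMS", nlms(x, d))):
    print(f"{name}: steady-state MSE = {np.mean(e[-500:] ** 2):.2e}")
```

Increasing `mu` speeds up convergence for both filters but raises the steady-state MSE, which is the step-size dependence the abstract refers to.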
