Abstract
Millimeter wave (mmWave) communications, envisaged for next-generation wireless networks, rely on large antenna arrays and very narrow, high-gain beams. This poses significant challenges for beam alignment between transmitter and receiver, a problem that has attracted considerable research attention. Even when alignment is achieved, the link remains subject to beam drift (BD). BD, caused by non-ideal features inherent in practical beams and by rapidly changing environments, refers to the phenomenon in which the center of the main lobe of the beam in use deviates from the true dominant channel direction, further degrading system performance. To mitigate the BD effect, in this paper we first theoretically analyze its impact on outage probability and effective achievable rate, taking practical factors (e.g., the rate of environmental change, beam width, and transmit power) into account. Then, departing from conventional practice, we propose a novel design philosophy in which multi-resolution beams with varying beam widths are used for data transmission while narrow beams are employed for beam training. Finally, we design an efficient learning-based algorithm that adaptively chooses an appropriate beam width according to the environment. Simulation results demonstrate the effectiveness and superiority of our proposals.
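The abstract does not specify the learning algorithm used for adaptive beam-width selection. As a hypothetical illustration only, the problem could be framed as a multi-armed bandit, where each candidate beam width is an arm and the observed effective rate is the reward; the sketch below uses a simple epsilon-greedy strategy. All specifics (candidate widths, mean rates, noise model) are invented toy values, not taken from the paper.

```python
import random

def select_beam_width(num_arms, rewards, counts, epsilon=0.1):
    """Epsilon-greedy choice among candidate beam widths (arms)."""
    if random.random() < epsilon:
        return random.randrange(num_arms)  # explore a random width
    # Exploit: pick the width with the best average observed rate;
    # untried arms get +inf so each is sampled at least once.
    avg = [rewards[i] / counts[i] if counts[i] > 0 else float("inf")
           for i in range(num_arms)]
    return max(range(num_arms), key=lambda i: avg[i])

def run(num_slots=1000, seed=0):
    """Toy simulation: learn the best beam width over transmission slots."""
    random.seed(seed)
    widths = [5.0, 10.0, 20.0]  # candidate beam widths in degrees (assumed)
    # Hypothetical mean effective rates per width (invented for illustration):
    # a mid-resolution beam wins by balancing gain against drift robustness.
    mean_rate = {5.0: 0.6, 10.0: 0.9, 20.0: 0.5}
    rewards = [0.0] * len(widths)
    counts = [0] * len(widths)
    for _ in range(num_slots):
        i = select_beam_width(len(widths), rewards, counts)
        # Observed effective rate = mean rate + Gaussian noise (toy channel).
        r = mean_rate[widths[i]] + random.gauss(0.0, 0.05)
        rewards[i] += r
        counts[i] += 1
    # Report the width the learner settled on (most frequently chosen).
    return widths[max(range(len(widths)), key=lambda i: counts[i])]
```

In this toy setting the learner converges to the 10-degree beam, mirroring the paper's point that the best beam width is environment-dependent rather than always the narrowest.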