Abstract

The millimeter-wave (mmWave) frequencies planned for 6G systems present challenges for channel modeling: at these frequencies, surface roughness affects wave propagation, and signals suffer severe attenuation. Beamforming techniques are generally used to compensate for these losses. Although digital beamforming determines user signals more accurately and quickly, its high complexity and costly design make it unsuitable for massive MIMO mmWave systems; analog beamforming, which relies on low-cost phase shifters, is therefore preferred, but the analog beam-training techniques proposed so far are often challenging in practice. In this work, we propose a deep learning model for analog beam training that predicts the optimal beam vector. Our model uses an available dataset comprising 18 base stations and over one million users at a carrier frequency of 60 GHz. The training process first applies a stacked autoencoder to extract features from the training data, then uses a multilayer perceptron (MLP) to predict the optimal beams. The results are evaluated by computing the mean squared error between the expected and predicted beams on the test set, and show high efficiency compared to the benchmark method, which uses the MLP alone.
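The two-stage pipeline the abstract describes (a stacked autoencoder for feature extraction feeding an MLP beam predictor) can be sketched roughly as follows. This is a minimal illustration using scikit-learn on synthetic data; the layer sizes, feature dimensions, and beam-codebook size here are arbitrary assumptions, not the paper's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor, MLPClassifier

rng = np.random.default_rng(0)
n_users, n_features, n_beams = 500, 64, 16  # assumed toy dimensions
X = rng.standard_normal((n_users, n_features))   # synthetic channel features
y = rng.integers(0, n_beams, size=n_users)       # synthetic optimal beam indices

# Stage 1: stacked autoencoder trained to reconstruct its input;
# the 8-unit bottleneck layer provides the extracted features.
ae = MLPRegressor(hidden_layer_sizes=(32, 8, 32), max_iter=300, random_state=0)
ae.fit(X, X)

def encode(X):
    # Forward pass through the encoder half only (first two hidden layers).
    h = X
    for W, b in zip(ae.coefs_[:2], ae.intercepts_[:2]):
        h = np.maximum(h @ W + b, 0)  # ReLU, MLPRegressor's default activation
    return h

# Stage 2: MLP trained on the bottleneck features to predict the optimal beam.
Z = encode(X)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(Z, y)
pred = clf.predict(encode(X))
```

With random labels the predictions are of course meaningless; the point is only the structure: autoencoder reconstruction loss first, then a supervised head on the compressed representation, rather than training a single MLP directly on the raw features as in the benchmark.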

