Abstract

The data rate of existing systems must be greatly increased in order to support the numerous applications of next-generation communication systems. One way to address this problem is to use Large Intelligent Surfaces (LIS): panels with mounted reflective elements whose primary function is to redirect the electromagnetic signal toward the intended user. As a result, both the strength of the received signal and the reception quality improve, enhancing the Quality of Service (QoS). Machine Learning algorithms have been applied to LIS in a number of ways, including channel estimation and the calculation of discrete phase shifts, to mention a few. Here, a Deep Learning (DL) model is proposed to estimate the Signal to Noise Ratio (SNR) of a LIS-assisted communication system. The effects of the SGD, Adam, RMSProp, Adadelta, and Adamax optimizers on the DL model are investigated, with the Mean Squared Error (MSE) loss function. The Adam optimizer offers the highest accuracy, making it preferable to the other optimizers. The SNR for 10 users is estimated using the proposed DL model with the Adam optimizer. The outcomes of this work are contrasted with the SNR estimate of an existing technique that calculates SNR from the LIS size and the locations of the transmitter and receiver with an accuracy of 93.5%. The simulation results indicate that the accuracy is increased to 96% using the proposed DL model, which attains an average error of 3.6.
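The optimizer comparison described above can be illustrated with a minimal sketch. The paper's actual network, dataset, and hyperparameters are not given in the abstract, so the toy regression problem, learning rates, and step counts below are assumptions chosen only to contrast a plain SGD update with an Adam update under an MSE loss:

```python
import numpy as np

# Hypothetical toy regression: fit y = X w under an MSE loss, and
# compare plain SGD against Adam (the optimizer the paper prefers).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=200)

def mse_and_grad(w):
    """Full-batch MSE loss and its gradient."""
    err = X @ w - y
    return np.mean(err ** 2), (err @ X) * (2 / len(y))

def sgd(steps=500, lr=0.05):
    """Plain gradient descent on the MSE objective."""
    w = np.zeros(4)
    for _ in range(steps):
        loss, g = mse_and_grad(w)
        w -= lr * g
    return loss

def adam(steps=500, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates of the gradient."""
    w = np.zeros(4)
    m = np.zeros(4)  # first-moment (mean) estimate
    v = np.zeros(4)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        loss, g = mse_and_grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return loss

print(f"SGD  final MSE: {sgd():.6f}")
print(f"Adam final MSE: {adam():.6f}")
```

Both optimizers drive the MSE toward the noise floor here; the paper's finding is that on its SNR-estimation task, Adam reaches the highest accuracy among the five optimizers tested.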
