Abstract

Because data traffic is growing rapidly due to advances in the Internet of Things, accurately modelling and predicting the Long-Term Evolution (LTE) channel is critical for applications such as video streaming, efficient bandwidth utilization, and power management. In this research, we propose a model based on Computational Intelligence (CI) algorithms that can improve channel estimation from the received signal. Two algorithms are considered. In contrast to previous work that focused solely on estimating the channel with the traditional Minimum Mean Square Error (MMSE) and Least Square (LS) algorithms, we apply (1) a Genetic Algorithm (GA) and (2) Particle Swarm Optimization (PSO) to discrete and continuous LTE drive-test data. We focus in particular on LTE in the 5.8 GHz band. The proposed model aims to improve channel estimation by reducing the mean square error of LS and the complexity of MMSE. Pilots are placed at random positions and transmitted with the data to gather channel information, which the receiver uses to decode and estimate the channel with LS, MMSE, Taguchi GA, and PSO. The Bit Error Rate (BER), Signal-to-Noise Ratio (SNR), and Mean Square Error (MSE) of the CI-based model are evaluated. Compared with the MMSE and LS algorithms, the proposed model achieves the target BER gains of 2.4 dB and 5.4 dB, respectively.
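For readers unfamiliar with the two baseline estimators named above, the following is a minimal sketch of pilot-based LS and MMSE channel estimation for a single OFDM symbol. It is not the authors' implementation: the subcarrier count, pilot placement, channel statistics, and the identity channel-covariance assumption are all illustrative choices made for this example.

```python
# Illustrative sketch of pilot-based LS and MMSE channel estimation.
# All dimensions, pilot values, and channel/noise statistics are assumed
# for demonstration and do not come from the paper's drive-test data.
import numpy as np

rng = np.random.default_rng(0)

n_sub = 64                                                      # subcarriers in one OFDM symbol
pilot_idx = np.sort(rng.choice(n_sub, size=16, replace=False))  # randomly placed pilots
x_pilot = np.exp(1j * np.pi / 4) * np.ones(len(pilot_idx))      # known unit-power pilot symbols

# Synthetic frequency-selective channel and noisy received pilot observations
h_true = (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)
snr_db = 20
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(len(pilot_idx))
                                  + 1j * rng.standard_normal(len(pilot_idx)))
y_pilot = h_true[pilot_idx] * x_pilot + noise

# LS estimate at the pilot positions: H_LS = Y / X
h_ls = y_pilot / x_pilot

# MMSE estimate: H_MMSE = R_hh (R_hh + noise_var * I)^-1 H_LS,
# assuming an identity channel covariance R_hh for simplicity.
r_hh = np.eye(len(pilot_idx))
h_mmse = r_hh @ np.linalg.inv(r_hh + noise_var * np.eye(len(pilot_idx))) @ h_ls

mse_ls = np.mean(np.abs(h_ls - h_true[pilot_idx]) ** 2)
mse_mmse = np.mean(np.abs(h_mmse - h_true[pilot_idx]) ** 2)
print(f"LS MSE:   {mse_ls:.4f}")
print(f"MMSE MSE: {mse_mmse:.4f}")
```

In this sketch the MMSE step shrinks the LS estimate according to the noise variance and channel covariance, which is where its lower error and higher complexity come from; the paper's GA- and PSO-based estimators instead search for channel coefficients that minimize an error objective, which is not reproduced here.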
