Abstract

Precise modelling and accurate estimation of long-term evolution (LTE) channels are essential for applications such as video streaming, efficient bandwidth use and power utilization, particularly as data traffic continues to grow with advances in the Internet of Things. Previous works focused mainly on estimating the channel with traditional minimum mean square error (MMSE) and least squares (LS) algorithms. The proposed model enhances LTE channel estimation by combining the LS and MMSE methods with the Taguchi genetic algorithm (GA) and particle swarm optimization (PSO), respectively. We consider LTE operating in the 5.8 GHz range. Pilot signals are sent randomly along with the data to obtain information about the channel; they help the receiver decode the signal and support the LS and MMSE estimators combined with the Taguchi GA and PSO, respectively. The performance of the computational intelligence (CI)-based model was evaluated in terms of bit error rate (BER), signal-to-noise ratio and mean square error. The proposed model achieved gains of 2.4 dB and 5.4 dB in BER compared with the MMSE and LS algorithms, respectively.

Highlights

  • Long Term Evolution (LTE) began as a project in early 2005 by the 3rd Generation Partnership Project (3GPP), a telecommunications standards organization

  • Because the channel response varies from frame to frame, the system must obtain an accurate Channel Impulse Response (CIR) for every frame

  • This paper introduces a straightforward but effective computational intelligence-based approach for channel estimation



Introduction

LTE stands for Long Term Evolution, and it began as a project in early 2005 by the 3rd Generation Partnership Project (3GPP), a telecommunications standards organization. LTE channel estimation relies on known information shared between the transmitting and receiving sides. The efficiency of channel estimation can be improved with a semi-blind estimation scheme that uses both pilot and data symbols. This method employs the tracked signal as a feedback mechanism to monitor the channel's behaviour, and also as a reference for predicting future data. When training symbols are used, the channel is estimated with the Least Squares (LS) and Minimum Mean Square Error (MMSE) algorithms, which improve system performance by lowering the Bit Error Rate (BER) [7, 8]. The random pilot signals are subjected to the optimization technique, which is compared against fixed pilot signals processed by the LS and MMSE algorithms [11, 12].
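The pilot-based LS and MMSE estimation described above can be sketched as follows. This is a minimal illustrative example, not the paper's actual model: the subcarrier count, pilot spacing, SNR and the simple per-pilot MMSE shrinkage are all assumptions chosen for clarity, and the Taguchi GA / PSO optimization stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sc = 64                           # number of OFDM subcarriers (assumed)
pilot_idx = np.arange(0, n_sc, 8)   # pilot on every 8th subcarrier (assumed)

# True channel: a short random multipath impulse response, viewed in frequency
h_time = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H_true = np.fft.fft(h_time, n_sc)

# Known pilot symbols transmitted through the channel plus AWGN
X_p = np.ones(len(pilot_idx))       # BPSK pilots, all +1
snr_db = 20
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(len(pilot_idx))
                                  + 1j * rng.standard_normal(len(pilot_idx)))
Y_p = H_true[pilot_idx] * X_p + noise

# LS estimate at the pilot positions: H_LS = Y / X
H_ls = Y_p / X_p

# Per-pilot MMSE-style shrinkage of the LS estimate, assuming unit average
# channel power: H_MMSE = P_h / (P_h + sigma^2) * H_LS
p_h = 1.0
H_mmse = (p_h / (p_h + noise_var)) * H_ls

# Interpolate pilot estimates across all subcarriers (real and imag parts)
H_hat = np.interp(np.arange(n_sc), pilot_idx, H_ls.real) \
        + 1j * np.interp(np.arange(n_sc), pilot_idx, H_ls.imag)

mse_ls = np.mean(np.abs(H_ls - H_true[pilot_idx]) ** 2)
mse_mmse = np.mean(np.abs(H_mmse - H_true[pilot_idx]) ** 2)
```

In the paper's approach, the pilot positions themselves are treated as decision variables for the Taguchi GA and PSO rather than being fixed on a regular grid as in this sketch.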
