Abstract

Recently, much research has focused on employing deep learning (DL) algorithms to perform channel estimation in the upcoming 6G communication systems. However, these DL algorithms are usually computationally demanding and require a large number of training samples. Hence, this work investigates the feasibility of designing efficient machine learning (ML) algorithms that can effectively estimate and track time-varying, frequency-selective channels. The proposed algorithm is integrated with orthogonal frequency-division multiplexing (OFDM) to eliminate the intersymbol interference (ISI) induced by the frequency-selective multipath channel, and it is compared with the well-known least squares (LS) and linear minimum mean square error (LMMSE) channel estimation algorithms. The obtained results demonstrate that even when a small number of pilot samples, $N_{P}$, is inserted before the $N$-subcarrier OFDM symbol, the introduced ML-based channel estimation is superior to the LS and LMMSE algorithms. This advantage is reflected in the bit-error-rate (BER) performance of the proposed algorithm, which attains gains of 2.5 dB and 5.5 dB over the LMMSE and LS algorithms, respectively, when $N_{P}=\frac{N}{8}$. Furthermore, the BER performance of the proposed algorithm degrades by only 0.2 dB when the maximum Doppler frequency is randomly varied. Finally, the number of iterations required by the proposed algorithm to converge to the smallest achievable mean-squared error (MSE) is thoroughly examined for various signal-to-noise ratio (SNR) levels.
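For readers unfamiliar with the two baselines mentioned above, the sketch below illustrates pilot-aided LS and LMMSE channel estimation over a frequency-selective channel with $N_{P}=\frac{N}{8}$ pilots. It is a minimal illustration, not the paper's implementation: the subcarrier count, comb-type pilot pattern, SNR, and exponential power-delay profile are all assumptions made for the example.

```python
# Minimal sketch (not the paper's code): comb-type pilot insertion with
# N_P = N/8 pilots, followed by LS and LMMSE channel estimation.
# All parameter values and the exponential power-delay profile are assumptions.
import numpy as np

N = 64                                  # OFDM subcarriers (assumed)
N_P = N // 8                            # pilot count, N_P = N/8 as in the abstract
pilot_idx = np.arange(0, N, N // N_P)   # comb-type pilot positions (assumed)
snr_db = 15.0
snr = 10 ** (snr_db / 10)

rng = np.random.default_rng(0)

# Frequency-selective channel: L-tap exponential power-delay profile (assumed)
L = 8
pdp = np.exp(-np.arange(L) / 2.0)
pdp /= pdp.sum()
h = np.sqrt(pdp / 2) * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
H = np.fft.fft(h, N)                    # true channel frequency response

# Received pilots: Y_p = X_p * H_p + noise (unit-power QPSK pilots)
X_p = (1 + 1j) / np.sqrt(2) * np.ones(N_P)
noise = np.sqrt(1 / (2 * snr)) * (rng.standard_normal(N_P) + 1j * rng.standard_normal(N_P))
Y_p = X_p * H[pilot_idx] + noise

# LS estimate at the pilot tones, then linear interpolation across subcarriers
H_ls_p = Y_p / X_p
H_ls = np.interp(np.arange(N), pilot_idx, H_ls_p.real) \
     + 1j * np.interp(np.arange(N), pilot_idx, H_ls_p.imag)

# LMMSE smoothing of the LS pilot estimates, using the frequency-domain
# channel correlation R_HH derived from the assumed power-delay profile
F = np.fft.fft(np.eye(N))[:, :L]        # N x L partial DFT matrix
R_HH = F @ np.diag(pdp) @ F.conj().T    # full subcarrier correlation matrix
R_pp = R_HH[np.ix_(pilot_idx, pilot_idx)]
R_fp = R_HH[:, pilot_idx]
W = R_fp @ np.linalg.inv(R_pp + (1 / snr) * np.eye(N_P))
H_lmmse = W @ H_ls_p

print("LS    MSE:", np.mean(np.abs(H_ls - H) ** 2))
print("LMMSE MSE:", np.mean(np.abs(H_lmmse - H) ** 2))
```

The LMMSE smoother exploits the channel correlation and the noise level, which is why it typically outperforms plain LS interpolation at the cost of a matrix inversion over the pilot set.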

Highlights

  • The ever-increasing throughput requirement by modern communication systems has heightened the need for new practical and intelligent signal processing schemes [1]

  • We considered an uncoded orthogonal frequency-division multiplexing (OFDM)-based system that receives information over a time-varying, frequency-selective channel

  • The proposed algorithm was employed in this OFDM system to estimate the time-varying channel and effectively track its variation from frame to frame (see the sketch after this list)
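As a rough illustration of such a channel (not the paper's simulation setup), the sketch below generates multipath tap gains that evolve from frame to frame according to a first-order Gauss-Markov model whose frame-to-frame correlation follows the Jakes autocorrelation, $\rho = J_0(2\pi f_d T_{\mathrm{frame}})$. The tap count, power-delay profile, Doppler frequency, and frame duration are assumptions.

```python
# Minimal sketch (assumed parameters, not the paper's setup): a time-varying
# frequency-selective channel whose taps evolve from frame to frame via a
# first-order Gauss-Markov model with correlation rho = J0(2*pi*f_d*T_frame).
import numpy as np
from scipy.special import j0

rng = np.random.default_rng(1)

L = 8                                   # number of multipath taps (assumed)
pdp = np.exp(-np.arange(L) / 2.0)       # exponential power-delay profile (assumed)
pdp /= pdp.sum()

f_d = 100.0                             # maximum Doppler frequency in Hz (assumed)
T_frame = 1e-3                          # frame duration in seconds (assumed)
rho = j0(2 * np.pi * f_d * T_frame)     # Jakes tap correlation between frames

n_frames = 50
h = np.sqrt(pdp / 2) * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
taps = np.empty((n_frames, L), dtype=complex)
for k in range(n_frames):
    taps[k] = h
    # Gauss-Markov evolution keeps each tap's average power equal to pdp[l]
    w = np.sqrt(pdp / 2) * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
    h = rho * h + np.sqrt(1 - rho ** 2) * w

# Frequency response per frame; an estimator would track H[k] frame by frame
N = 64
H = np.fft.fft(taps, N, axis=1)
print("model correlation rho:", round(rho, 3))
print("empirical tap-0 correlation:",
      np.corrcoef(np.abs(taps[:-1, 0]), np.abs(taps[1:, 0]))[0, 1].round(3))
```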


Summary

INTRODUCTION

The ever-increasing throughput requirement of modern communication systems has heightened the need for new practical and intelligent signal processing schemes [1]. In [9], the authors presented a DL-based orthogonal frequency-division multiplexing (OFDM) receiver that exploits DL techniques to obtain the channel state information (CSI) and detect the transmitted symbols. He et al. [10] considered a mmWave massive MIMO system that utilized a convolutional neural network (CNN) to estimate the channel matrix, H. DL algorithms have high computational requirements and need a large amount of data for training. This fact has motivated us to investigate the feasibility of introducing a computationally efficient, low-complexity ML algorithm specially designed for channel estimation.

SYSTEM ARCHITECTURE
TRANSMITTED SIGNAL
MACHINE LEARNING CHANNEL ESTIMATION
COMPLEXITY ANALYSIS
RESULTS
CONCLUSION