Long-Term Evolution (LTE) is the next generation of current mobile telecommunication networks, offering a flat radio-network architecture and significant increases in spectral efficiency, throughput, and user capacity. This paper presents a performance analysis of robust channel estimators for the Downlink Long Term Evolution-Advanced (DL LTE-A) system using three Artificial Neural Networks: a Feed-forward Neural Network (FFNN), a Cascade-forward Neural Network (CFNN), and a Layered Recurrent Neural Network (LRN), each trained separately with the Back-Propagation algorithm; in addition, an ANN is trained by a Genetic Algorithm (GA). The methods use the information obtained from the received reference symbols to estimate the total frequency response of the channel in two phases: in the first phase, the proposed ANN-based method learns to adapt to the channel variations, and in the second phase it estimates the channel matrix to improve the performance of LTE. The estimation methods are evaluated by simulations in the Vienna LTE-A DL Link Level Simulator in MATLAB. The performance of the proposed channel estimator, an ANN trained by a Genetic Algorithm (ANN-GA), is compared in terms of throughput with the traditional Least Squares (LS) algorithm and with the other ANN-based estimators (FFNN, LRN, and CFNN) for Closed Loop Spatial Multiplexing (CLSM) Single-User Multiple-Input Multiple-Output (MIMO, 2×2 and 4×4). Simulation results show that the proposed ANN-GA gives better performance than the other ANN-based estimation methods and LS.

Index Terms — LTE-A, MIMO, Artificial Neural Network (ANN), Back-Propagation, Layered Recurrent Neural Network (LRN), Feed-forward Neural Network (FFNN), Cascade-forward Neural Network (CFNN), Genetic Algorithm (GA).
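As a rough illustration of the two-phase idea (not the authors' implementation), the following MATLAB sketch trains a feed-forward network to refine noisy least-squares estimates of a flat Rayleigh channel obtained at pilot positions; the number of pilots, the SNR, the hidden-layer size, and all variable names are assumptions made for this example.

% Minimal sketch, assuming a flat Rayleigh channel observed at pilot symbols.
numPilots = 200;                                            % assumed number of reference symbols
h = (randn(1,numPilots) + 1j*randn(1,numPilots)) / sqrt(2); % true channel taps
p = exp(1j*pi/4) * ones(1,numPilots);                       % known pilot symbols
snrDb = 10;                                                 % assumed SNR
n = 10^(-snrDb/20) * (randn(1,numPilots) + 1j*randn(1,numPilots)) / sqrt(2);
y = h .* p + n;                                             % received pilots
hLS = y ./ p;                                               % least-squares estimate (noisy)

% Phase 1: learn the mapping from noisy LS estimates to the true channel.
X = [real(hLS); imag(hLS)];            % network input: LS estimate (real/imag parts)
T = [real(h);   imag(h)];              % training target: true channel
net = feedforwardnet(10);              % one hidden layer of 10 neurons (assumed size)
net.trainParam.showWindow = false;     % suppress the training GUI
net = train(net, X, T);                % back-propagation (Levenberg-Marquardt by default)

% Phase 2: apply the trained network to refine the channel estimate.
hNN = net(X);
fprintf('LS MSE: %.4g, NN-refined MSE: %.4g\n', ...
    mean(sum(([real(hLS); imag(hLS)] - T).^2, 1)), mean(sum((hNN - T).^2, 1)));

In the full system the estimator would be applied across the subcarriers of the OFDM grid rather than to a single flat tap, and the GA variant would replace gradient-based training with a population-based search over the network weights; this sketch only shows the learn-then-estimate structure described in the abstract.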