Abstract
LTE (Long Term Evolution) is a 3GPP (Third Generation Partnership Project) wireless standard that uses OFDMA (Orthogonal Frequency Division Multiple Access) modulation, MU-MIMO (Multiuser Multiple Input Multiple Output) technology, and various multipath fading models. LTE allows operators to use spectrum more efficiently and deliver high-speed data. This paper characterizes the downlink performance of LTE. The demand for high-data-rate applications called for MIMO technology, a breakthrough in wireless communication that is defined in the LTE standard. Among the many metrics available to characterize performance, one of the most convenient and informative is the BER (Bit Error Rate), so performance is characterized here in terms of BER. In this paper the LTE system is modeled and simulated in MATLAB, and BER results for 2×2 and 4×4 MIMO-LTE with 16QAM and 64QAM modulation in a Rayleigh fading environment are obtained over a range of SNR values.
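The paper's simulations are carried out in MATLAB over the full LTE downlink chain; as a hedged illustration of the underlying idea, the following NumPy sketch estimates BER for Gray-coded 16QAM over a flat Rayleigh fading channel with zero-forcing equalization. This is a simplified single-antenna Monte Carlo model, not the paper's 2×2/4×4 MIMO-LTE setup, and all names in it are this sketch's own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gray-coded 4-PAM levels used on each of the I and Q axes of 16QAM
pam = np.array([-3, -1, 1, 3])
gray = np.array([0, 1, 3, 2])      # 2-bit Gray label for each level index
popcnt = np.array([0, 1, 1, 2])    # number of set bits in values 0..3

def ber_16qam_rayleigh(snr_db, n_sym=200_000):
    """Monte Carlo BER estimate for 16QAM over flat Rayleigh fading."""
    # Draw random level indices for the I and Q components
    i_idx = rng.integers(0, 4, n_sym)
    q_idx = rng.integers(0, 4, n_sym)
    s = (pam[i_idx] + 1j * pam[q_idx]) / np.sqrt(10)  # unit average symbol energy

    # Flat Rayleigh channel: complex Gaussian gain, unit average power
    h = (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym)) / np.sqrt(2)

    # AWGN scaled to the requested per-symbol SNR
    n0 = 10 ** (-snr_db / 10)
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(n_sym)
                               + 1j * rng.standard_normal(n_sym))
    r = h * s + noise

    # Zero-forcing equalization, then nearest-level hard decision per axis
    y = r / h * np.sqrt(10)
    det_i = np.abs(y.real[:, None] - pam[None, :]).argmin(axis=1)
    det_q = np.abs(y.imag[:, None] - pam[None, :]).argmin(axis=1)

    # Count bit errors via XOR of the Gray labels (2 bits per axis, 4 per symbol)
    errs = (popcnt[np.bitwise_xor(gray[i_idx], gray[det_i])].sum()
            + popcnt[np.bitwise_xor(gray[q_idx], gray[det_q])].sum())
    return errs / (4 * n_sym)
```

Running the estimator at increasing SNR values reproduces the qualitative behaviour the paper plots: BER falls monotonically with SNR, and over Rayleigh fading it decays far more slowly than over an AWGN-only channel.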
IOSR Journal of Electronics and Communication Engineering