Abstract

LTE is a candidate wide-area communications network for the Smart Grid and can enable applications such as Advanced Metering Infrastructure (AMI), Demand Response and Wide Area Measurement Systems (WAMS). We compare the uplink performance of the LTE frequency division duplex (FDD) and time division duplex (TDD) modes for a typical Smart Grid scenario, in which a large number of devices send small-to-medium-sized packets, to understand the advantages and disadvantages of the two modes. An OPNET simulation model is employed to facilitate realistic comparisons based on latency and channel utilization. We demonstrate that there is a critical packet size above which there is a step increase in uplink latency, a consequence of the LTE uplink resource scheduling process. It is shown that FDD leads to better uplink performance in terms of latency, while TDD can provide greater flexibility when the split between uplink and downlink traffic is asymmetrical, as it is expected to be in a Smart Grid environment. It is also demonstrated that the capacity of both FDD and TDD systems, measured as the number of serviced devices, is control channel (PDCCH) limited for small, infrequent packets; TDD, however, has the advantage that capacity remains data channel (PUSCH) limited down to smaller packet sizes and lower data burst rates than in an FDD system.
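The latency step arises from the LTE uplink grant cycle: a device must first send a scheduling request, wait for an uplink grant, and, if the packet does not fit in a single grant, report its remaining buffer and wait for further grants. The following minimal Python sketch illustrates this mechanism; all timing and grant-size constants are illustrative assumptions chosen for clarity, not values from the OPNET model described above.

    # Sketch of why uplink latency jumps at a critical packet size: once a
    # packet no longer fits in one uplink grant, each extra grant cycle
    # (buffer status report -> grant -> data) adds whole milliseconds.
    # All constants below are illustrative assumptions, not paper values.

    SR_PERIOD_MS = 10     # assumed scheduling-request opportunity period
    GRANT_DELAY_MS = 4    # assumed request-to-grant processing delay
    TTI_MS = 1            # LTE transmission time interval (1 ms)
    GRANT_BYTES = 100     # assumed bytes transportable per uplink grant

    def uplink_latency_ms(packet_bytes: int) -> float:
        """Rough one-way uplink latency for a single packet."""
        # Average wait for the next scheduling-request opportunity.
        latency = SR_PERIOD_MS / 2
        # First grant arrives after the scheduling request is processed.
        latency += GRANT_DELAY_MS
        # Each grant carries GRANT_BYTES; extra grants (requested via a
        # buffer status report) each cost another grant cycle plus a TTI.
        grants_needed = -(-packet_bytes // GRANT_BYTES)  # ceiling division
        latency += grants_needed * TTI_MS
        latency += (grants_needed - 1) * GRANT_DELAY_MS
        return latency

    for size in (50, 100, 101, 200, 201):
        print(f"{size:4d} bytes -> {uplink_latency_ms(size):5.1f} ms")

Under these assumptions, latency is flat up to 100 bytes and then jumps by a full grant cycle each time the packet size crosses a multiple of the per-grant capacity, reproducing the step behaviour reported in the results.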
