Abstract

Energy saving for the LTE-A network with relay nodes in TDD mode is addressed in this paper, and integrated sleep scheduling schemes for relay nodes and user devices under a base station are designed. The authors' two previously proposed ideas, namely, Load-Based Power Saving (LBPS) and Virtual Time, are adopted in the design, and two strategies, namely, top-down and bottom-up, each with three LBPS schemes, are proposed. In the top-down strategy, the load as well as the channel quality on the backhaul link is first considered to determine the sleep pattern for all relay nodes, and then the sleep schedule for UEs under each relay node is determined accordingly. In the bottom-up strategy, by contrast, the load and the channel quality on the access links are considered first and then integrated into the sleep schedule on the backhaul link. Two associated mechanisms for the proposed LBPS schemes to operate in the virtual time domain are also proposed in the paper, i.e., calculation of the virtual subframe capacity and the mapping mechanism from virtual time to actual time. The benefit of the proposed schemes in power saving over the standard-based scheme is demonstrated by the simulation study, and the bottom-up scheme BU-Split outperforms the other schemes under equally distributed input load as well as the hotspot scenario. A discussion of the tradeoff between processing overhead and performance for the proposed schemes is also presented.

Highlights

  • Long-Term Evolution (LTE) and its successor LTE-Advanced (LTE-A) have become the major mobile communications technologies and have been rapidly deployed worldwide in recent years.

  • For the uplink (UL) and downlink (DL), two duplex modes are specified in the LTE/LTE-A standard: frequency division duplex and time division duplex.

  • Yang et al., EURASIP Journal on Wireless Communications and Networking (2019) 2019:226, Fig. 14: power saving efficiency (PSE) of TD-Aggr, simulation results with backhaul Channel Quality Indicator (CQI) = 10 and access CQI = 9.


Introduction

1.1 Motivation

Long-Term Evolution [1], denoted by LTE, and its successor LTE-Advanced [2], denoted by LTE-A, have become the major mobile communications technologies and have been rapidly deployed worldwide to provide versatile services and attract more users in recent years. In order to extend the coverage area and provide higher transmission rates for User Equipment (denoted by UE) at the cell edge, the idea of the relay node (denoted by RN) was introduced in the LTE-A standard. Since there are two transmission directions in LTE/LTE-A, namely, uplink (denoted by UL) and downlink (denoted by DL), two modes of duplex transmission are specified in the LTE/LTE-A standard: frequency division duplex (denoted by FDD) and time division duplex (denoted by TDD). In FDD, two different and sufficiently separated frequency bands are used, one for each direction. In TDD, UL and DL transmission share a single frequency band in a time-sharing manner. Different numbers of DL and UL subframes are designated in different configurations within a time period of 10 ms to provide flexibility in resource management for both directions.
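The seven TDD uplink-downlink configurations mentioned above can be illustrated with a minimal sketch. The patterns below follow the standard 3GPP table of UL-DL configurations (each letter is one 1 ms subframe of the 10 ms radio frame: D = downlink, U = uplink, S = special subframe); the helper `subframe_counts` is an illustrative name, not from the paper.

```python
# Sketch of the LTE TDD uplink-downlink configurations (3GPP TS 36.211).
# One string per configuration = one 10 ms radio frame of ten subframes.
# D = downlink, U = uplink, S = special (DL-to-UL switch) subframe.
TDD_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDSUUD",
}

def subframe_counts(config: int) -> tuple[int, int, int]:
    """Return (#DL, #UL, #special) subframes for a TDD configuration."""
    pattern = TDD_CONFIGS[config]
    return pattern.count("D"), pattern.count("U"), pattern.count("S")

# The DL/UL mix ranges from UL-heavy (config 0) to DL-heavy (config 5),
# which is the resource-management flexibility referred to above.
for cfg, pattern in TDD_CONFIGS.items():
    dl, ul, sp = subframe_counts(cfg)
    print(f"config {cfg}: {pattern}  ->  {dl} DL, {ul} UL, {sp} special")
```

This flexibility is what a TDD sleep scheduler must work with: a scheme for relay nodes and UEs has to place sleep periods around whichever DL/UL pattern the cell is configured with.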
