Abstract

Previous works on orthogonal frequency division multiple access (OFDMA) systems with quantized channel state information (CSI) were mainly based on suboptimal quantization methods. In this paper, we consider the performance limit of OFDMA systems with quantized CSI over independent Rayleigh fading channels using rate-distortion theory. First, we establish a lower bound on the capacity of the feedback channel and construct the test channel that achieves this lower bound. Then, with the derived test channel, we characterize the system performance by the outage throughput and formulate the outage throughput maximization problem under quantized CSI. To solve this problem with low complexity, we develop a suboptimal algorithm that performs resource allocation in two steps: subcarrier allocation followed by power allocation. Using this approach, we can numerically evaluate the outage throughput as a function of the feedback rate. Numerical results show that the suboptimal algorithm provides near-optimal performance (with a performance loss of less than 5%) and that the outage throughput with a limited feedback rate can approach that with perfect CSI.

Highlights

  • Orthogonal frequency division multiplexing (OFDM) is a promising technique for the next-generation wireless communication systems

  • Adaptive resource allocation implies that the channel state information (CSI) of users should be known to the base station (BS)

  • We derived the rate-distortion function (RDF) for the downlink CSI. This RDF gives a lower bound on the capacity of the feedback channel according to the rate-distortion theory
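
As a concrete illustration of the kind of lower bound the rate-distortion theory provides (a standard textbook result for a Gaussian source under squared-error distortion, not the paper's exact RDF for the Rayleigh-fading power gain):

```latex
R(D) \;=\; \max\!\left(0,\; \tfrac{1}{2}\log_2\frac{\sigma^2}{D}\right),
\qquad 0 < D,
```

where \(\sigma^2\) is the source variance. Any quantizer describing the source with mean squared error at most \(D\) must spend at least \(R(D)\) bits per sample, and \(R(D) = 0\) once \(D \ge \sigma^2\), i.e. no feedback bits are needed when the tolerated distortion exceeds the source variance.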


Summary

Introduction

Orthogonal frequency division multiplexing (OFDM) is a promising technique for next-generation wireless communication systems. Prior work considered a minimum mean square error channel prediction scheme to overcome the detrimental effect of feedback delay and proposed resource allocation algorithms to maximize the downlink throughput. The authors in [18] proposed an OFDMA throughput maximization algorithm under the assumption that the quantization used for CSI feedback is optimal from the rate-distortion point of view. By rate-distortion theory [2], the rate-distortion function (RDF) gives the minimum number of bits for the index I_k that can describe the channel power gain A_k without exceeding the mean quantization error D_k; Theorem 1 gives the RDF of A_k. In downlink throughput maximization with imperfect CSI, we require the probability density function of the actual power gain conditioned on the quantized power gain. When the feedback rate is zero, no CSI is fed back to the BS.
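The two-step resource allocation described in the abstract can be sketched as follows. This is an illustrative minimal version, not the paper's exact algorithm: subcarriers are greedily assigned to the user with the best reported gain, and power is then distributed by standard water-filling. The function name `allocate` and the greedy assignment rule are assumptions for the sketch.

```python
import numpy as np

def allocate(gains, total_power, noise=1.0):
    """Two-step resource allocation sketch: greedy subcarrier assignment
    followed by water-filling power allocation.

    gains[u, k] is the (quantized) channel power gain of user u on
    subcarrier k; total_power is the BS power budget.
    """
    num_users, num_subc = gains.shape

    # Step 1: subcarrier allocation -- give each subcarrier to the user
    # with the largest reported power gain on it.
    owner = gains.argmax(axis=0)
    g = gains[owner, np.arange(num_subc)] / noise  # effective gain-to-noise

    # Step 2: power allocation by water-filling. Find the water level mu
    # such that sum over active subcarriers of (mu - 1/g) equals the budget.
    order = np.argsort(g)[::-1]          # subcarriers sorted by gain, best first
    g_sorted = g[order]
    for m in range(num_subc, 0, -1):     # try m active subcarriers, largest m first
        mu = (total_power + np.sum(1.0 / g_sorted[:m])) / m
        if mu > 1.0 / g_sorted[m - 1]:   # weakest active subcarrier gets power > 0
            break
    power = np.maximum(mu - 1.0 / g, 0.0)
    rates = np.log2(1.0 + power * g)     # per-subcarrier achievable rates
    return owner, power, rates
```

With perfect CSI this is the classical throughput-maximizing structure; with quantized CSI the same steps would be driven by the conditional statistics of the actual gain given the fed-back index, which is where the test channel derived in the paper enters.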

Outage throughput maximization with quantized CSI
Power allocation
Findings
Conclusions
