Abstract

The behavior of orthogonal frequency division multiplexing (OFDM) signals passed through a bandpass nonlinearity is presented. In particular, the in-band bit-error-rate (BER) degradation and the adjacent channel interference induced by amplitude limiting, or clipping, are analyzed. In the presence of both nonlinear distortion and additive Gaussian noise, an optimized output power back-off is provided to balance the requirement of minimum BER against the tolerable adjacent channel interference for a given OFDM system.
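
To make the clipping mechanism concrete, the sketch below simulates one oversampled OFDM symbol, passes it through a soft-limiter (clipping) model of a bandpass nonlinearity at several back-off levels, and reports the in-band error vector magnitude alongside the out-of-band power ratio, a rough proxy for adjacent channel interference. This is a minimal illustration, not the paper's model: the subcarrier count, oversampling factor, QPSK mapping, and the convention of defining back-off relative to the average signal power are all assumptions introduced here.

```python
import numpy as np

N_SUBCARRIERS = 64   # occupied subcarriers (illustrative assumption)
OVERSAMPLE = 4       # frequency-domain zero padding exposes spectral regrowth

rng = np.random.default_rng(0)

def ofdm_symbol():
    """One oversampled OFDM symbol with QPSK-modulated subcarriers."""
    bits = rng.integers(0, 2, size=(N_SUBCARRIERS, 2))
    qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    spec = np.zeros(N_SUBCARRIERS * OVERSAMPLE, dtype=complex)
    spec[:N_SUBCARRIERS // 2] = qpsk[:N_SUBCARRIERS // 2]     # positive freqs
    spec[-(N_SUBCARRIERS // 2):] = qpsk[N_SUBCARRIERS // 2:]  # negative freqs
    return np.fft.ifft(spec) * np.sqrt(spec.size)

def soft_limit(x, backoff_db):
    """Soft limiter: clip the complex envelope at a level backoff_db above
    the average signal power (one common back-off convention, assumed here)."""
    clip_level = np.sqrt(np.mean(np.abs(x) ** 2) * 10 ** (backoff_db / 10))
    mag = np.abs(x)
    scale = np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
    return x * scale

x = ofdm_symbol()
n = x.size
in_band = np.r_[0:N_SUBCARRIERS // 2, n - N_SUBCARRIERS // 2:n]
out_band = np.setdiff1d(np.arange(n), in_band)
X = np.fft.fft(x)

for backoff_db in (0.0, 3.0, 6.0):
    Y = np.fft.fft(soft_limit(x, backoff_db))
    # In-band distortion: error vector magnitude on the occupied subcarriers.
    evm = np.linalg.norm(Y[in_band] - X[in_band]) / np.linalg.norm(X[in_band])
    # Out-of-band leakage: power that clipping sprays outside the occupied
    # band, a rough stand-in for adjacent channel interference.
    oob = np.sum(np.abs(Y[out_band]) ** 2) / np.sum(np.abs(Y[in_band]) ** 2)
    print(f"back-off {backoff_db:4.1f} dB: EVM {evm:.3f}, OOB/in-band {oob:.2e}")
```

Raising the back-off reduces both the in-band distortion and the out-of-band leakage, at the cost of average transmit power; this is precisely the trade-off the optimized output back-off in the paper is meant to balance against the additive-Gaussian-noise BER.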
