Abstract

It is widely recognized that orthogonal frequency division multiplexing (OFDM) signals are very prone to nonlinear distortion effects, which can lead to significant performance degradation. However, recent results have shown that nonlinear distortion does not necessarily imply performance degradation and can in fact yield performance improvements over conventional linear OFDM schemes. In this paper, we consider the effects of bandpass memoryless nonlinear devices on OFDM signals and study the optimum asymptotic performance for both nondispersive channels and severely time-dispersive channels. We present analytical methods for obtaining the Euclidean distance between two OFDM signals that are subjected to different nonlinear characteristics. These results are then employed to obtain the average asymptotic gain relative to conventional linear OFDM schemes in nondispersive channels, as well as the gain distribution for severely time-dispersive channels with Rayleigh-distributed fading on the different multipath components. Our analytical results, which are shown to be very accurate, indicate that the optimum detection of OFDM schemes with strong nonlinear distortion effects allows significant gains over conventional linear OFDM schemes.
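To make the central quantity concrete, the following is a minimal sketch (not taken from the paper) of the kind of computation involved: two OFDM time-domain signals differing in one subcarrier are passed through an ideal envelope clipper, a simple example of a bandpass memoryless nonlinearity, and the Euclidean distance between them is computed before and after clipping. The clipping level `A`, the QPSK constellation, and the block size are illustrative assumptions.

```python
import cmath
import math
import random

def idft(symbols):
    """Unitary inverse DFT: maps frequency-domain symbols to time samples."""
    N = len(symbols)
    return [sum(symbols[k] * cmath.exp(2j * math.pi * k * n / N)
                for k in range(N)) / math.sqrt(N)
            for n in range(N)]

def envelope_clip(x, A):
    """Ideal envelope clipping: a bandpass memoryless nonlinearity that
    limits the magnitude of each sample to A while preserving its phase."""
    return [s if abs(s) <= A else A * s / abs(s) for s in x]

def euclidean_distance(a, b):
    """Euclidean distance between two complex-valued signals."""
    return math.sqrt(sum(abs(u - v) ** 2 for u, v in zip(a, b)))

random.seed(0)
N = 64                                  # illustrative number of subcarriers
qpsk = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
s1 = [random.choice(qpsk) for _ in range(N)]
s2 = list(s1)
s2[0] = -s1[0]                          # the two blocks differ in one subcarrier

x1, x2 = idft(s1), idft(s2)
d_linear = euclidean_distance(x1, x2)
d_nonlinear = euclidean_distance(envelope_clip(x1, 1.0),
                                 envelope_clip(x2, 1.0))
# Since the IDFT here is unitary, d_linear equals the frequency-domain
# distance |2 * s1[0]| = 2 * sqrt(2) by Parseval's theorem.
```

The paper's analytical methods characterize how this distance changes under the nonlinearity (and hence the asymptotic gain); the sketch merely shows the objects being compared.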
