Abstract

This paper studies classes of generic deterministic, discrete-time, memoryless, “nonlinear” additive white Gaussian noise (AWGN) channels. Subject to several types of constraints, such as even-moment constraints, compact-support constraints, or mixtures thereof, the optimal input is proved to be discrete with a finite number of mass points in the vast majority of cases. Only under the even-moment constraint, and for special cases that emulate the average-power-constrained linear channel, is capacity found to be achieved by an absolutely continuous input. The results are extended to channels with general piecewise-nonlinear distortions, where the discrete nature of the optimal input is preserved. These results are obtained by developing methodology and tools based on standard decompositions in a Hilbert space with the Hermite polynomials as a basis, and it is shown that these bases are natural candidates for general information-theoretic studies of the capacity of channels affected by AWGN. Along the way, novel results on the output rate of decay of Gaussian channels are derived: the output probability distribution of any channel subjected to additive Gaussian noise necessarily decays “slower” than the Gaussian itself. Finally, numerical computations are provided for some sample cases, optimal inputs are determined, and capacity curves are drawn. These results call into question the accuracy of the widely used expression (1/2)log(1 + SNR) for computing the capacities of deterministic Gaussian channels.
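As a rough illustration of the Hermite-based decompositions mentioned above, the following Python sketch projects a nonlinear distortion onto the probabilists' Hermite polynomials, which are orthogonal under the standard Gaussian weight. The helper `hermite_coefficients` and the sample distortion f(x) = x + 0.1x³ are illustrative assumptions for this sketch, not the paper's actual construction.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def hermite_coefficients(f, max_order, quad_points=64):
    """Project f onto the probabilists' Hermite polynomials He_n,
    orthogonal under the N(0,1) weight: c_n = E[f(X) He_n(X)] / n!."""
    x, w = hermegauss(quad_points)      # Gauss-Hermite nodes/weights for weight exp(-x^2/2)
    w = w / sqrt(2 * pi)                # normalize so the weights integrate the N(0,1) density
    return [np.sum(w * f(x) * hermeval(x, [0] * n + [1])) / factorial(n)
            for n in range(max_order + 1)]

# Example: a mildly nonlinear (hypothetical) distortion f(x) = x + 0.1 x^3
coeffs = hermite_coefficients(lambda x: x + 0.1 * x**3, max_order=5)
print(np.round(coeffs, 4))  # ~[0, 1.3, 0, 0.1, 0, 0]: only odd-order terms for an odd function
```

Under these assumptions, the expansion f = 1.3 He₁ + 0.1 He₃ is recovered exactly, since x = He₁(x) and x³ = He₃(x) + 3He₁(x).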
