Abstract

In some applications, the variance of additive measurement noise depends on the signal that we aim to measure. For instance, additive signal-dependent Gaussian noise (ASDGN) channel models are used in molecular and optical communication. Herein, we provide lower and upper bounds on the capacity of additive signal-dependent noise (ASDN) channels. The first lower bound is based on an extension of majorization inequalities, and the second lower bound utilizes properties of the differential entropy. The lower bounds are valid for arbitrary ASDN channels. The upper bound is based on a previous idea of the authors (“symmetric relative entropy”) and is applied to ASDGN channels. These bounds indicate that in ASDN channels (unlike classical additive white Gaussian noise channels), the capacity does not necessarily increase when the noise variance function is reduced. We also provide sufficient conditions under which the capacity becomes infinite. This is complemented by conditions implying that the capacity is finite and that a unique capacity-achieving measure exists (in the sense of the output measure).
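To make the channel model concrete, the following sketch simulates an ASDGN channel of the common form Y = X + σ(X)·Z with Z standard Gaussian, and empirically verifies that the noise spread depends on the transmitted signal level. The specific variance function `sigma` below is a hypothetical illustrative choice, not one taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigma(x):
    # Hypothetical signal-dependent noise standard deviation; the abstract
    # does not fix a particular form, so this linear choice is illustrative.
    return 0.1 + 0.5 * np.abs(x)

def asdgn_channel(x, rng):
    # Y = X + sigma(X) * Z with Z ~ N(0, 1): additive signal-dependent
    # Gaussian noise (ASDGN).
    z = rng.standard_normal(x.shape)
    return x + sigma(x) * z

# Empirically check that the noise spread grows with the input level.
for level in (0.0, 1.0, 2.0):
    x = np.full(100_000, level)
    y = asdgn_channel(x, rng)
    print(f"x = {level}: empirical noise std = {np.std(y - x):.3f}, "
          f"sigma(x) = {sigma(level):.3f}")
```

Unlike a classical AWGN channel, where the noise statistics are fixed, here the effective noise seen at the output varies with the chosen input, which is the feature driving the capacity behavior described above.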
