Abstract

The entropy power inequality (EPI) for the convolution of two independent random variables was first proposed by Shannon (1948). In practice, however, there are many situations in which the random variables involved are not independent. In this article, considering additive noise channels, we show that, under certain conditions, the EPI holds even when the random variables involved are dependent. Along the way, a lower bound on the Fisher information of the output signal is obtained, which is useful in its own right. An example is also provided to illustrate the result.
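For reference, the classical form of Shannon's EPI for independent random vectors may be stated as follows (a standard formulation; the dependent-variable extension discussed above is the subject of the article itself):

```latex
% Entropy power of an R^n-valued random vector X with differential entropy h(X):
%   N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}.
% Shannon's EPI: for independent X and Y,
%   N(X + Y) \;\ge\; N(X) + N(Y),
% with equality if and only if X and Y are Gaussian with proportional covariances.
\[
  N(X+Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n}.
\]
```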
