Abstract

We consider the rate entropy function, defined as the infimum of the average mutual information subject to a conditional entropy distortion constraint. Conditional entropy distortion is the expected distortion under logarithmic loss. It can be regarded as a generalized distortion measure, since it does not satisfy non-negativity. Like the rate-distortion function, the rate entropy function is essentially a variational problem with no general solution, so we propose three methods to construct a particular solution. Accordingly, closed-form expressions of the rate entropy function are derived for the uniform source and several additive sources. In particular, neither the squared error nor the magnitude error distortion exists for the Cauchy source, owing to the absence of first- and higher-order moments, but the entropy distortion does. The entropy distortion varies with the source: for Gaussian sources it is equivalent to squared error distortion, while for vector Gaussian sources it is equivalent to the determinant of the error covariance matrix, which differs from both the covariance matrix distortion and the trace distortion.
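As a sketch of the quantities above in standard information-theoretic notation (the symbols $X$, $Y$, $p(y \mid x)$, and $q$ here are illustrative choices, not taken from the paper), the rate entropy function can be written as

\[
R(D) \;=\; \inf_{p(y \mid x)\,:\, H(X \mid Y) \le D} I(X; Y),
\]

where the conditional entropy constraint arises as an expected logarithmic loss: for a reproduction distribution $q$, the log loss is $d(x, q) = \log \tfrac{1}{q(x)}$, and choosing the posterior $q = p(\cdot \mid y)$ gives $\mathbb{E}[d(X, q)] = H(X \mid Y)$. For the Gaussian claims, a standard identity (assuming $X$ and $Y$ are jointly Gaussian, with scalar conditional variance $\sigma^2_{X \mid Y}$ and, in the vector case, error covariance matrix $\Sigma_e$) ties conditional differential entropy to the squared error and its determinant analogue:

\[
h(X \mid Y) = \tfrac{1}{2} \log\!\bigl(2 \pi e \, \sigma^2_{X \mid Y}\bigr),
\qquad
h(\mathbf{X} \mid \mathbf{Y}) = \tfrac{1}{2} \log\!\bigl((2 \pi e)^n \det \Sigma_e\bigr),
\]

so bounding the conditional entropy is equivalent to bounding the mean squared error (respectively, the determinant of the error covariance matrix).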
