Abstract

In this paper, we present a new refinement of Jensen’s inequality with applications in information theory. The refinement is obtained from the general functional introduced in the work of Popescu et al. As applications in information theory, we provide new, tighter bounds for Shannon’s entropy and for some f-divergences.
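
For context, the classical discrete Jensen’s inequality that is being refined states that for a convex function f : C → ℝ, points x_i ∈ C, and nonnegative weights p_i summing to 1,

    f\Bigg(\sum_{i=1}^{n} p_i x_i\Bigg) \;\le\; \sum_{i=1}^{n} p_i f(x_i).

A refinement interpolates an intermediate quantity between the two sides of this inequality; specializing such a refinement is what produces the sharper entropy and divergence bounds described below.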

Highlights

  • Let C be a convex subset of the linear space X and f a convex function on C

  • We propose and prove new tighter upper bounds for Shannon’s entropy than the bound given in [4] (see the sketch after this list)

  • Since each partition from m+1 is a refinement of a partition from m, the result follows
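
As background for the entropy highlight, here is the standard estimate that such upper bounds sharpen (a classical fact, not the paper’s new bound): for a probability distribution p = (p_1, …, p_n), applying Jensen’s inequality to the concave logarithm gives

    H(p) = \sum_{i=1}^{n} p_i \log\frac{1}{p_i}
         \;\le\; \log\Bigg(\sum_{i=1}^{n} p_i \cdot \frac{1}{p_i}\Bigg) = \log n.

The tighter upper bounds proposed in the paper, like the bound of [4] they improve, refine estimates of this kind by replacing the single Jensen step with a refined one.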

Summary

Introduction

Let C be a convex subset of the real linear space X and assume that f : C → ℝ is a convex function on C. A well-known refinement of Jensen’s inequality in this setting is due to S. S. Dragomir. In 2016, Popescu et al. [4] built on it and defined a new refined functional D(f, p, x; J, J1, J2, ..., Jm). Horváth et al. [6] presented new upper bounds for the Shannon entropy (see Corollary 1) and defined an extended f-divergence functional (see Definition 2) by applying a cyclic refinement of Jensen’s inequality. In this paper we obtain new bounds for some f-divergences that are better than the bounds given in [3].
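
To indicate the shape of such refinement functionals, here is a minimal one-subset sketch in the spirit of Dragomir’s construction; the exact multi-index functional D(f, p, x; J, J1, J2, ..., Jm) of Popescu et al. is given in [4] and restated in Section 2. For a nonempty proper subset J of {1, ..., n} with weight P_J = \sum_{i \in J} p_i, set

    D(f, p, x; J) = P_J\, f\Bigg(\frac{1}{P_J} \sum_{i \in J} p_i x_i\Bigg)
                    + (1 - P_J)\, f\Bigg(\frac{1}{1 - P_J} \sum_{i \notin J} p_i x_i\Bigg).

Convexity of f gives the refinement chain

    f\Bigg(\sum_{i=1}^{n} p_i x_i\Bigg) \;\le\; D(f, p, x; J) \;\le\; \sum_{i=1}^{n} p_i f(x_i),

so optimizing the middle term over index sets tightens both sides of Jensen’s inequality; the multi-index functional of [4] plays an analogous role with several subsets J1, ..., Jm.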

General inequalities by generalization
New upper bounds for Shannon’s entropy
New lower bounds for f-divergence measures (illustrated below)
Conclusion
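
For orientation on the f-divergence bounds, recall the classical Jensen-based lower bound that refined versions sharpen (a standard fact, stated here assuming q_i > 0 for all i):

    D_f(p, q) := \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right)
             \;\ge\; f\!\left(\sum_{i=1}^{n} q_i \cdot \frac{p_i}{q_i}\right) = f(1).

With f(t) = t \log t this is the nonnegativity of the Kullback–Leibler divergence, since f(1) = 0. The lower bounds obtained in the paper tighten this single Jensen step via the refined functional.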