Abstract

Jensen’s inequality is one of the fundamental inequalities in mathematics, with applications in almost every field of science. In 2003, Mercer gave a variant of Jensen’s inequality, now known as the Jensen–Mercer inequality. The purpose of this article is to propose new bounds for Csiszár and related divergences by means of the Jensen–Mercer inequality. We also investigate several new bounds for the Zipf–Mandelbrot entropy. The ideas in this article may further stimulate research in information theory with the help of the Jensen–Mercer inequality.
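For reference, Mercer’s variant can be stated as follows; this is the standard 2003 formulation, supplied here for the reader’s convenience rather than quoted from the article, in the notation of the convexity definition recalled in the highlights below. If φ: [δ, ε] ⟶ R is convex, w₁, …, wₙ ∈ [δ, ε], and t₁, …, tₙ ≥ 0 with t₁ + ⋯ + tₙ = 1, then

φ(δ + ε − Σᵢ tᵢwᵢ) ≤ φ(δ) + φ(ε) − Σᵢ tᵢφ(wᵢ).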

Highlights

  • In the theory of inequalities, convex functions play an important role. The definition of a convex function [1] is as follows. Let φ: [δ, ε] ⊂ R ⟶ R be a function; φ is said to be convex if ∀ w, z ∈ [δ, ε] and 0 ≤ t ≤ 1, the inequality φ(tw + (1 − t)z) ≤ tφ(w) + (1 − t)φ(z) (1) holds, and φ is said to be strictly convex if ∀ w ≠ z and t ∈ (0, 1), (1) holds strictly

  • If inequality (1) holds in the reversed direction, φ is said to be concave, and φ is said to be strictly concave if inequality (1) holds strictly in the reversed direction ∀ w ≠ z and t ∈ (0, 1) (a numeric sanity check of these definitions follows this list)
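The following is a minimal numeric sanity check of definition (1) and of the Jensen–Mercer inequality stated after the abstract; the sample convex function φ(x) = x² and the randomly drawn points and weights are illustrative choices of ours, not taken from the article.

```python
import numpy as np

# Numeric sanity check of the convexity inequality (1) and of the
# Jensen-Mercer inequality; phi(x) = x**2 and all test points/weights
# are illustrative choices, not taken from the article.
def phi(x):
    return x ** 2

delta, eps = 0.0, 1.0
rng = np.random.default_rng(0)

for _ in range(1000):
    w, z = rng.uniform(delta, eps, size=2)
    t = rng.uniform()
    # Inequality (1): phi(t*w + (1 - t)*z) <= t*phi(w) + (1 - t)*phi(z).
    assert phi(t * w + (1 - t) * z) <= t * phi(w) + (1 - t) * phi(z) + 1e-12

    # Jensen-Mercer (Mercer, 2003):
    # phi(delta + eps - sum_i t_i w_i) <= phi(delta) + phi(eps) - sum_i t_i phi(w_i).
    ws = rng.uniform(delta, eps, size=5)
    ts = rng.dirichlet(np.ones(5))  # nonnegative weights summing to 1
    lhs = phi(delta + eps - ts @ ws)
    rhs = phi(delta) + phi(eps) - ts @ phi(ws)
    assert lhs <= rhs + 1e-12

print("All convexity and Jensen-Mercer checks passed.")
```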

Summary

Introduction

Convex functions play an important role in the theory of inequalities; the definition of a convex function [1] is recalled in the highlights above. The Kullback–Leibler divergence K(θ, c) is nonnegative and is zero if and only if θλ = cλ for all λ. It thus satisfies two of the properties of a metric, but K(θ, c) ≠ K(c, θ), and it does not obey the triangle inequality. Shannon entropy [37] can be constructed from the Kullback–Leibler divergence and is given as H(θ) = −Σλ θλ log θλ. Variational distance is used to characterize strong typicality and the asymptotic equipartition of sequences generated by sampling from a given distribution [48]. Mandelbrot generalized Zipf’s law to what is now known as the Zipf–Mandelbrot law, which improves the fit on account of the low-rank words in a corpus [58]; it assigns to rank λ ∈ {1, …, N} the probability f(λ; N, q, s) = (λ + q)^(−s)/H(N, q, s), where H(N, q, s) = Σ_{k=1}^{N} (k + q)^(−s). Earlier authors have used two parametric Zipf–Mandelbrot laws instead of different weights in the inequalities for Shannon entropy. We establish bounds for the Zipf–Mandelbrot entropy by applying Zipf–Mandelbrot laws instead of probability distributions in the Kullback–Leibler and Jeffreys divergences, and we deduce new estimates for the Zipf–Mandelbrot entropy associated with different parametric Zipf–Mandelbrot laws.
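As a concrete illustration of the objects just named, here is a minimal Python sketch, not taken from the article, that computes the Kullback–Leibler divergence, the Shannon entropy, and a Zipf–Mandelbrot law under the standard definitions recalled above; the function names and the parameter values (N = 100 and the q, s pairs) are our own illustrative choices.

```python
import numpy as np

def kl_divergence(theta, c):
    """K(theta, c) = sum_lam theta_lam * log(theta_lam / c_lam).

    Nonnegative, zero iff theta == c; asymmetric and without the
    triangle inequality, as the introduction notes.
    """
    theta, c = np.asarray(theta, float), np.asarray(c, float)
    mask = theta > 0
    return float(np.sum(theta[mask] * np.log(theta[mask] / c[mask])))

def shannon_entropy(theta):
    """H(theta) = -sum_lam theta_lam * log(theta_lam)."""
    theta = np.asarray(theta, float)
    mask = theta > 0
    return float(-np.sum(theta[mask] * np.log(theta[mask])))

def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot law: rank lam in 1..N gets probability
    (lam + q)**(-s) / H(N, q, s), with H(N, q, s) the normalizer."""
    lam = np.arange(1, N + 1)
    weights = (lam + q) ** (-s)
    return weights / weights.sum()

# Two parameterizations; N = 100 and the q, s values are illustrative only.
p = zipf_mandelbrot(100, q=1.0, s=1.2)
r = zipf_mandelbrot(100, q=2.0, s=1.5)
print("Zipf-Mandelbrot entropy H(p):", shannon_entropy(p))
print("K(p, r):", kl_divergence(p, r), "  K(r, p):", kl_divergence(r, p))
```

Running the script prints different values for K(p, r) and K(r, p), making the asymmetry mentioned above directly visible.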

Bounds for Csiszár and Related Divergences
Bounds for Zipf–Mandelbrot Entropy
