Abstract

In this study, we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback-Leibler divergence and the Hellinger and χ² distances as particular cases. We study the symmetrized divergences in additive and multiplicative forms. Some basic properties such as symmetry, monotonicity and log-convexity are established. An important result from convexity theory is also proved.

Highlights

  • Denote by Ω+ = {μ = {μi} : μi > 0, ∑ μi = 1} the family of finite discrete probability distributions

  • Our aim in this study is to investigate some global properties of the symmetrized measures Us = Us(μ, ν) = Us(ν, μ) ≔ Ks(μ||ν) + Ks(ν||μ) and Vs = Vs(μ, ν) = Vs(ν, μ) ≔ Ks(μ||ν)Ks(ν||μ) (a numerical sketch follows these highlights)

  • Vs is monotone decreasing for s ∈ (−∞, 1/2) and monotone increasing for s ∈ (1/2, +∞).
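To make the highlights concrete, here is a minimal numerical sketch. It assumes the standard closed form of Ks (stated in the Introduction below), with the Kullback-Leibler limits at s = 0 and s = 1; the distributions mu and nu are hypothetical examples, not data from the paper.

```python
import numpy as np

def K_s(mu, nu, s):
    """Relative divergence of type s between discrete distributions mu, nu.

    Assumed closed form: (sum_i mu_i^s nu_i^(1-s) - 1) / (s(s-1)) for
    s not in {0, 1}; the limits give K(nu||mu) at s = 0 and the
    Kullback-Leibler divergence K(mu||nu) at s = 1.
    """
    mu, nu = np.asarray(mu, dtype=float), np.asarray(nu, dtype=float)
    if s == 0:
        return float(np.sum(nu * np.log(nu / mu)))  # K(nu||mu)
    if s == 1:
        return float(np.sum(mu * np.log(mu / nu)))  # K(mu||nu)
    return float((np.sum(mu**s * nu**(1 - s)) - 1) / (s * (s - 1)))

def U_s(mu, nu, s):
    """Additive symmetrization: U_s = K_s(mu||nu) + K_s(nu||mu)."""
    return K_s(mu, nu, s) + K_s(nu, mu, s)

def V_s(mu, nu, s):
    """Multiplicative symmetrization: V_s = K_s(mu||nu) * K_s(nu||mu)."""
    return K_s(mu, nu, s) * K_s(nu, mu, s)

# Illustrate the monotonicity highlight: V_s should decrease on
# (-inf, 1/2) and increase on (1/2, +inf) as a function of s.
mu = [0.2, 0.3, 0.5]  # example distributions (hypothetical)
nu = [0.4, 0.4, 0.2]
for s in (-1.0, 0.0, 0.25, 0.5, 0.75, 1.0, 2.0):
    print(f"s = {s:5.2f}   U_s = {U_s(mu, nu, s):.6f}   V_s = {V_s(mu, nu, s):.6f}")
```

Note that Ks(ν||μ) = K(1−s)(μ||ν), so Vs is symmetric under s ↦ 1 − s, which is consistent with a minimum at s = 1/2.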


Summary

Introduction

The famous Csiszár f-divergence Cf(μ||ν) [1] is known as the most general divergence measure in information theory. For a convex function f : (0, ∞) → ℝ, the f-divergence measure is defined as

Cf(μ||ν) ≔ ∑ νi f(μi/νi),

where μ, ν ∈ Ω+. The choice f(t) = (t^s − 1)/(s(s − 1)) gives the relative divergence of type s,

Ks(μ||ν) ≔ (∑ μi^s νi^(1−s) − 1)/(s(s − 1)), s ∈ ℝ \ {0, 1};  Ks(μ||ν) ≔ K(ν||μ), s = 0;  Ks(μ||ν) ≔ K(μ||ν), s = 1,

where {μi} and {νi}, i = 1, …, n, are given probability distributions and K(μ||ν) = ∑ μi log(μi/νi) is the Kullback-Leibler divergence. It includes the Hellinger and χ² distances as particular cases. The main properties of this measure are given in [3].
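As a sanity check on these definitions, the following sketch evaluates the Csiszár f-divergence for a few standard generators and confirms that f(t) = (t^s − 1)/(s(s − 1)) recovers Ks; for instance, s = 2 gives half the χ² distance. The generators and the Hellinger normalization are the usual textbook choices, not taken verbatim from the paper, and mu, nu are hypothetical examples.

```python
import numpy as np

def csiszar_divergence(mu, nu, f):
    """Csiszar f-divergence: C_f(mu||nu) = sum_i nu_i * f(mu_i / nu_i)."""
    mu, nu = np.asarray(mu, dtype=float), np.asarray(nu, dtype=float)
    return float(np.sum(nu * f(mu / nu)))

mu = [0.2, 0.3, 0.5]  # example distributions (hypothetical)
nu = [0.4, 0.4, 0.2]

# Kullback-Leibler divergence: f(t) = t log t
kl = csiszar_divergence(mu, nu, lambda t: t * np.log(t))

# chi^2 distance: f(t) = (t - 1)^2
chi2 = csiszar_divergence(mu, nu, lambda t: (t - 1) ** 2)

# Squared Hellinger distance: f(t) = (sqrt(t) - 1)^2 / 2 (one common normalization)
hel2 = csiszar_divergence(mu, nu, lambda t: 0.5 * (np.sqrt(t) - 1) ** 2)

# Relative divergence of type s via f(t) = (t^s - 1) / (s(s - 1));
# at s = 2 this should equal chi^2 / 2.
s = 2.0
k2 = csiszar_divergence(mu, nu, lambda t: (t**s - 1) / (s * (s - 1)))

print(f"KL = {kl:.6f}, Hellinger^2 = {hel2:.6f}, chi^2/2 = {chi2 / 2:.6f}, K_2 = {k2:.6f}")
assert abs(k2 - chi2 / 2) < 1e-12
```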

Results and proofs
Applications
Conclusion
