Abstract

Hermite interpolation is used to establish a new generalization of an inequality for higher-order convex functions involving the Csiszár divergence on time scales. New entropic bounds in q-calculus and h-discrete calculus are also deduced, and some estimates for the Zipf–Mandelbrot entropy are given.
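For concreteness, the Zipf–Mandelbrot law and its entropy, as commonly defined in the literature on such estimates, can be written as follows (the notation N for the number of elements, q ≥ 0 for the shift, and s > 0 for the exponent is ours):

```latex
% Zipf--Mandelbrot probability mass function for i = 1, \dots, N:
f(i; N, q, s) = \frac{1}{(i+q)^{s}\, H_{N,q,s}},
\qquad
H_{N,q,s} = \sum_{j=1}^{N} \frac{1}{(j+q)^{s}} .

% Zipf--Mandelbrot entropy (Shannon entropy of this distribution):
Z(H; q, s)
  = \frac{s}{H_{N,q,s}} \sum_{j=1}^{N} \frac{\ln(j+q)}{(j+q)^{s}}
  + \ln H_{N,q,s} .
```

Setting q = 0 recovers the classical Zipf law, so bounds for Z(H; q, s) specialize to Zipf entropy estimates.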

Highlights

  • In the literature, several kinds of information divergence measures that compare two probability distributions are discussed and are widely applied in engineering, information theory, and statistics

  • Nonparametric measures are used to determine the amount of information provided by data in order to discriminate in favor of one probability distribution p1 over another p2, or to measure the affinity or distance between p1 and p2

  • Measures of entropy express the amount of information contained in a distribution, that is, the amount of uncertainty associated with the outcome of an experiment
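The divergence and entropy measures referred to in the highlights above can be sketched in standard notation (for probability distributions p = (p1, …, pn) and q = (q1, …, qn) and a convex function f with f(1) = 0):

```latex
% Csiszár f-divergence between p and q:
D_f(p, q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right).

% Special cases: f(t) = t \ln t gives the Kullback--Leibler divergence;
% f(t) = |t - 1| gives the total variation distance.

% Shannon entropy of p:
H(p) = -\sum_{i=1}^{n} p_i \ln p_i .
```

Different choices of f thus yield the familiar divergence measures as special cases, which is why bounds for D_f translate into entropic bounds.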



Introduction

Several kinds of information divergence measures that compare two probability distributions are discussed in the literature and are widely applied in engineering, information theory, and statistics. In [58], the authors used Hermite interpolation to obtain Popoviciu-type inequalities for n-convex functions. Motivated by this work, we use the Hermite interpolating polynomial to generalize the Csiszár-type inequality on time scales for n-convex functions.
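For readers unfamiliar with the term, the standard definition of n-convexity used in results of this type is via divided differences:

```latex
% A function f : [a,b] \to \mathbb{R} is n-convex if, for every choice
% of n+1 distinct points x_0, \dots, x_n \in [a,b], the n-th order
% divided difference is nonnegative:
[x_0, x_1, \dots, x_n;\, f] \ge 0,

% where divided differences are defined recursively by
[x_i;\, f] = f(x_i),
\qquad
[x_0, \dots, x_k;\, f]
  = \frac{[x_1, \dots, x_k;\, f] - [x_0, \dots, x_{k-1};\, f]}{x_k - x_0}.
```

In particular, a 0-convex function is nonnegative, a 1-convex function is nondecreasing, and a 2-convex function is convex in the usual sense; if f^{(n)} exists, f is n-convex if and only if f^{(n)} ≥ 0.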

