Abstract
Jensen's inequality is ubiquitous in measure and probability theory, statistics, machine learning, information theory and many other areas of mathematics and data science. It states that, for any convex function f defined on a convex domain K and any random variable X taking values in K, E[f(X)] ≥ f(E[X]). In this paper, sharp upper and lower bounds on E[f(X)], termed 'graph convex hull bounds', are derived for arbitrary functions f on arbitrary domains K, thereby extensively generalizing Jensen's inequality. The derivation of these bounds necessitates the investigation of the convex hull of the graph of f, which can be challenging for complex functions. On the other hand, once these inequalities are established, they hold, just like Jensen's inequality, for any K-valued random variable X. Therefore, these bounds are of particular interest in cases where f is relatively simple and the distribution of X is complicated or unknown. Both finite- and infinite-dimensional domains and codomains of f are covered, as well as analogous bounds for conditional expectations and Markov operators.
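As a quick illustration of the classical inequality that the paper generalizes, the following sketch checks E[f(X)] ≥ f(E[X]) numerically via Monte Carlo. The choices here (an exponential random variable X and the convex function f(x) = x², with expectations approximated by sample means) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# X: an exponential random variable on K = [0, inf), approximated by samples
x = rng.exponential(scale=2.0, size=100_000)

# f: a convex function on K
f = np.square

lhs = f(x).mean()   # Monte Carlo estimate of E[f(X)]
rhs = f(x.mean())   # f applied to the estimate of E[X]

# Jensen's inequality: E[f(X)] >= f(E[X])
print(lhs >= rhs)
```

For this choice, E[X] = 2 and E[X²] = 8, so the gap lhs − rhs ≈ 8 − 4 = 4; the graph convex hull bounds of the paper aim to control E[f(X)] from both sides for functions f that need not be convex.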