Abstract

Entropy is a measure of self-information used to quantify information loss. Entropy originated in thermodynamics, but it is also used to compare probability measures based on their differing information content. The corresponding model uncertainty is of particular interest and importance in stochastic programming and its applications, such as mathematical finance, since complete information is in general not accessible or manageable. This paper extends and generalizes the Entropic Value-at-Risk by involving Rényi entropies. We provide explicit relations among the different entropic risk measures, elaborate their dual representations, and make their interrelations explicit. We consider the largest spaces that allow studying the impact of information in detail, and we demonstrate that these spaces do not depend on the information loss. The dual norms and Hahn–Banach functionals are characterized explicitly.
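As an illustrative sketch (not taken from the paper itself), the Entropic Value-at-Risk mentioned above admits the well-known variational form EVaR_α(X) = inf_{t>0} t · log( E[exp(X/t)] / (1 − α) ). A minimal empirical estimator in Python, assuming i.i.d. samples and a simple grid search over t (the function name `evar` and the grid bounds are our own choices):

```python
import math
import random


def _log_mean_exp(values):
    # Numerically stable log of the sample mean of exp(values).
    m = max(values)
    return m + math.log(sum(math.exp(v - m) for v in values) / len(values))


def evar(samples, alpha, t_grid=None):
    """Empirical Entropic Value-at-Risk (sketch).

    Approximates  inf_{t>0}  t * log( E[exp(X/t)] / (1 - alpha) )
    by minimizing over a logarithmic grid of t values.
    """
    if t_grid is None:
        # t from 1e-2 to 1e2 on a log scale; widen for heavy-tailed data.
        t_grid = [10 ** (k / 20.0) for k in range(-40, 41)]
    log_level = math.log(1.0 - alpha)

    def objective(t):
        return t * (_log_mean_exp([x / t for x in samples]) - log_level)

    return min(objective(t) for t in t_grid)


if __name__ == "__main__":
    random.seed(0)
    xs = [random.gauss(0.0, 1.0) for _ in range(2000)]
    # EVaR dominates the mean and increases with the confidence level alpha.
    print(evar(xs, 0.95))
```

Because the objective is convex in t, the grid search could be replaced by any one-dimensional convex solver; the grid keeps the sketch dependency-free.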

