Abstract

In the past decade, the concept of a coherent risk measure has found many applications in finance, insurance, and operations research. In this paper, we introduce a new class of coherent risk measures constructed from information-type pseudo-distances that generalize the Kullback-Leibler divergence, also known as the relative entropy. We first analyze the primal and dual representations of this class. We then study the entropic value-at-risk (EVaR), the member of this class associated with the relative entropy. We also show that the conditional value-at-risk (CVaR), the most popular coherent risk measure, belongs to this class and is a lower bound for EVaR.
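The abstract does not spell out the definitions, but under the standard formulations from the risk-measure literature, EVaR_{1-α}(X) = inf_{z>0} z⁻¹ ln(M_X(z)/α) with M_X the moment-generating function, and CVaR_{1-α}(X) is the expectation of X beyond its (1-α)-quantile. As a sketch (not taken from the paper itself), the CVaR ≤ EVaR bound can be checked numerically for a Gaussian loss, where both measures have closed forms; the choice α = 0.05 is purely illustrative:

```python
import math
from statistics import NormalDist

alpha = 0.05          # illustrative tail level (not from the paper)
mu, sigma = 0.0, 1.0  # loss X ~ N(mu, sigma^2)

# For a Gaussian, M_X(z) = exp(mu*z + sigma^2*z^2/2), and minimizing
# (1/z)*ln(M_X(z)/alpha) over z > 0 gives the closed form below.
evar = mu + sigma * math.sqrt(2.0 * math.log(1.0 / alpha))

# Gaussian CVaR (expected shortfall): mu + sigma * phi(Phi^{-1}(1-alpha)) / alpha,
# where phi and Phi are the standard normal pdf and cdf.
nd = NormalDist()
var_q = nd.inv_cdf(1.0 - alpha)            # standard-normal VaR quantile
cvar = mu + sigma * nd.pdf(var_q) / alpha  # mean loss beyond VaR

print(f"VaR  = {mu + sigma * var_q:.4f}")
print(f"CVaR = {cvar:.4f}")
print(f"EVaR = {evar:.4f}")
assert cvar <= evar  # CVaR is a lower bound for EVaR, as the abstract states
```

For α = 0.05 this yields CVaR ≈ 2.06 and EVaR ≈ 2.45 standard deviations above the mean, consistent with the lower-bound claim; EVaR is the more conservative of the two measures.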
