Abstract

The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis, and economics. Motivated by statistical model selection via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of the optimal strategies that the models suggest. They interpreted their performance measures in information-theoretic terms, providing new generalizations of Shannon entropy and Kullback–Leibler relative entropy, which they called U-entropy and U-relative entropy. In this article, a utility-based criterion for the independence of two random variables is defined. Then, Markov's inequality for probabilities is extended from the U-entropy viewpoint. Moreover, a lower bound for the U-relative entropy is obtained. Finally, a link between conditional U-entropy and conditional Rényi entropy is derived.
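
For context (this sketch is not part of the original abstract; the precise definitions of U-entropy and U-relative entropy are given in Friedman and Sandow, 2003a), the classical quantities being generalized are the Shannon entropy, the Kullback–Leibler relative entropy, and, for the final result, the Rényi entropy:

% Shannon entropy of a discrete distribution p
H(p) = -\sum_{x} p(x)\,\log p(x)

% Kullback--Leibler relative entropy between p and q
D(p \,\|\, q) = \sum_{x} p(x)\,\log \frac{p(x)}{q(x)}

% Renyi entropy of order \alpha  (\alpha > 0, \alpha \neq 1)
H_{\alpha}(p) = \frac{1}{1-\alpha}\,\log \sum_{x} p(x)^{\alpha}

A standard special case illustrates (under the assumption that the Friedman–Sandow horse-race setting applies) how a utility-based performance measure recovers relative entropy: a logarithmic-utility investor facing odds O(x) = 1/q(x) who bets the fraction b(x) of wealth on outcome x attains

\sup_{b}\, \mathbb{E}_{p}\!\left[\log\big(b(X)\,O(X)\big)\right] = D(p \,\|\, q),

with the supremum achieved at the Kelly allocation b(x) = p(x); an investor who instead bets according to q attains expected log-wealth 0, so the utility gain from the better model p is exactly the Kullback–Leibler divergence.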
