Abstract

Decision theory studies the logic and the mathematical properties of decision making under uncertainty. Statistical decision theory focuses on decision making when uncertainty can be reduced by information acquired through experimentation. This article reviews the Bayesian approach to statistical decision theory, as developed from the seminal ideas of Savage. Specifically, it considers: the principle of maximization of expected utility and its axiomatic foundations; the basic elements of Bayesian statistical decision theory, illustrated using standard statistical decision problems; the measurement of the value of information in decision making; and the concept of admissibility.
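The principle of maximizing expected utility can be sketched in a few lines: given a prior over the states of nature and a utility for each action-state pair, the Bayes action is the one with the highest prior expected utility. The states, actions, and numerical payoffs below are purely illustrative assumptions, not from the article.

```python
# Minimal sketch of expected-utility maximization (hypothetical numbers).
# States of nature with prior probabilities:
priors = {"rain": 0.3, "sun": 0.7}

# utility[action][state]: hypothetical payoffs for each action-state pair
utility = {
    "umbrella": {"rain": 1.0, "sun": 0.5},
    "no_umbrella": {"rain": -1.0, "sun": 1.0},
}

def expected_utility(action):
    # Average the action's payoff over the states, weighted by the prior.
    return sum(priors[s] * utility[action][s] for s in priors)

# The Bayes action maximizes expected utility under the prior.
best = max(utility, key=expected_utility)
# best == "umbrella": 0.3*1.0 + 0.7*0.5 = 0.65 beats 0.3*(-1.0) + 0.7*1.0 = 0.4
```

When information from an experiment is available, the same calculation is carried out with the posterior in place of the prior; the expected gain from doing so is one way to quantify the value of information.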
