Abstract
The classic formula for estimating a binomial probability as the proportion of successes contradicts common sense for extreme probabilities, when the event never occurs or occurs every time. Laplace’s law of succession estimator, one of the first applications of Bayesian statistics, has been around for over 250 years and resolves these paradoxes, although it is rarely discussed in modern statistics texts. This work introduces a new theory for exact optimal statistical inference, using Laplace’s law of succession estimator as a motivating example. We prove that this estimator may be viewed from a different theoretical perspective as the limit point of the short confidence interval on the double-log scale when the confidence level approaches zero. This motivating example paves the way to defining an estimator as the inflection point of the cumulative distribution function viewed as a function of the parameter given the observed statistic. This estimator has the maximum infinitesimal probability of covering the unknown parameter and is therefore called the maximum concentration (MC) estimator, part of a more general M-statistics theory. The new theory is illustrated with exact optimal confidence intervals for the normal standard deviation and the respective MC estimators.
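As a brief illustrative sketch (not part of the published abstract), the contrast between the classic proportion-of-successes estimator and Laplace’s law of succession can be shown in a few lines of Python; the function names `classic_estimate` and `laplace_estimate` are illustrative only.

```python
def classic_estimate(successes, trials):
    """Classic estimator: proportion of observed successes, x / n."""
    return successes / trials

def laplace_estimate(successes, trials):
    """Laplace's law of succession: (x + 1) / (n + 2), i.e. one
    imaginary success and one imaginary failure are added."""
    return (successes + 1) / (trials + 2)

# Extreme cases: the event never occurs (x = 0) or occurs every time (x = n).
for x, n in [(0, 10), (10, 10), (3, 10)]:
    print(f"x={x}, n={n}: classic={classic_estimate(x, n):.3f}, "
          f"Laplace={laplace_estimate(x, n):.3f}")
```

For x = 0 the classic estimate is exactly 0 and for x = n it is exactly 1, whereas Laplace’s estimator returns 0.083 and 0.917 respectively for n = 10, avoiding the claim that the event is impossible or certain after a finite number of trials.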