Abstract
Fisher's fiducial argument was intended to solve arguably the most fundamental problem in statistics: probabilistic inference on unknown parameters without a prior distribution or Bayes's theorem. Despite Fisher's ingenuity, fiducial inference has, for various reasons, been largely dismissed by the statistical mainstream, so the problem remains open. This article surveys a recently proposed alternative to Fisher's theory, called inferential models (IMs), with the same goal of providing prior-free probabilistic inference. Like the fiducial approach, the IM approach starts by connecting the data and unknown parameters to an unobservable auxiliary variable defined through a particular representation of the sampling model. The IM's chief novelty is the introduction of a random set for predicting the unobserved value of that auxiliary variable. This makes the IM output a belief function rather than a probability, but an appropriate choice of the random set guarantees that this output is valid, which, among other things, implies that interval estimates derived from the IM have provable bounds on their frequentist error rates.
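To make the construction concrete, here is a minimal sketch, not part of the abstract, using the textbook normal-mean example often used to illustrate IMs. Assume the association X = θ + U with auxiliary variable U ~ N(0, 1), and the default symmetric random set S = {u : |u| ≤ |U′|} with U′ ~ N(0, 1); under these assumptions the plausibility of a singleton {θ} has the closed form pl(θ) = 2(1 − Φ(|x − θ|)), and the resulting plausibility region coincides with the usual z-interval, illustrating the validity (frequentist error-rate) property.

```python
from statistics import NormalDist

_PHI = NormalDist()  # standard normal, used for Phi and its inverse


def plausibility(theta: float, x: float) -> float:
    """Plausibility of the singleton assertion {theta}.

    Association: X = theta + U, U ~ N(0, 1).
    Predictive random set (assumed default): S = {u : |u| <= |U'|}, U' ~ N(0, 1).
    Then pl(theta) = P(|x - theta| <= |U'|) = 2 * (1 - Phi(|x - theta|)).
    """
    return 2.0 * (1.0 - _PHI.cdf(abs(x - theta)))


def plausibility_interval(x: float, alpha: float = 0.05) -> tuple[float, float]:
    """The 100(1 - alpha)% plausibility region {theta : pl(theta) > alpha}.

    For this model it equals x -/+ z_{1 - alpha/2}, the standard confidence
    interval, so its frequentist coverage is exactly 1 - alpha (validity).
    """
    z = _PHI.inv_cdf(1.0 - alpha / 2.0)
    return (x - z, x + z)
```

For example, with observed x = 1.3 the 95% plausibility interval is roughly (-0.66, 3.26), and the plausibility function peaks at 1 when theta equals the observed x.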