Abstract

Markov kernels play an important role in probability theory and mathematical statistics, conditional distributions being the main example. A Markov kernel can also be viewed as a nontrivial extension of the concepts of σ-field and statistic. For instance, in statistical decision theory, randomized procedures (also called decision rules or strategies) are Markov kernels, while nonrandomized procedures are statistics. It is well known that, in some situations, the optimal procedure is a randomized one: for example, the fundamental lemma of Neyman and Pearson shows that randomization is necessary to obtain a most powerful test.

Following this approach, we extend to Markov kernels well-known concepts of probability theory and mathematical statistics, such as independence, ancillarity and completeness. The reader is referred to Heyer (1982) for the corresponding extension of the concept of sufficiency.

Among other results, this paper includes: some characterizations of independence of Markov kernels and the stability of independence under composition with other Markov kernels; two ways of constructing independent Markov kernels; three examples of independence of Markov kernels in Bayesian theory, statistical decision theory and testing of hypotheses; a new source of examples of sufficient Markov kernels; and some results and examples on the stability properties of the extended concepts. In addition, a counterexample is included to exhibit a situation where a completeness property of statistics cannot be extended to Markov kernels.

As a final application of the results on independence, we extend to Markov kernels two celebrated results of Debabrata Basu in mathematical statistics relating independence, sufficiency, ancillarity and completeness.
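
To fix ideas, the following is a minimal sketch of the standard definition of a Markov kernel, with the measurable spaces $(\Omega_1, \mathcal{A}_1)$, $(\Omega_2, \mathcal{A}_2)$ and the map $K$ introduced here for illustration rather than taken from the paper:

$$K \colon \Omega_1 \times \mathcal{A}_2 \longrightarrow [0,1]$$

is a Markov kernel from $(\Omega_1, \mathcal{A}_1)$ to $(\Omega_2, \mathcal{A}_2)$ if $A \mapsto K(\omega_1, A)$ is a probability measure on $(\Omega_2, \mathcal{A}_2)$ for every $\omega_1 \in \Omega_1$, and $\omega_1 \mapsto K(\omega_1, A)$ is $\mathcal{A}_1$-measurable for every $A \in \mathcal{A}_2$. A (nonrandomized) statistic $T \colon \Omega_1 \to \Omega_2$ is recovered as the degenerate kernel $K_T(\omega_1, A) = \mathbf{1}_A(T(\omega_1))$, which is the sense in which Markov kernels extend statistics.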
