Abstract

To address the unknown nature of probability-sampling models, in this paper we use information theoretic concepts and the Cressie-Read (CR) family of information divergence measures to produce a flexible family of probability distributions, likelihood functions, estimators, and inference procedures. The usual case in statistical modeling is that the noisy, indirect data are observed and known, while the sampling model, error distribution, and probability space consistent with the data are unknown. To address the unknown sampling process underlying the data, we consider a convex combination of two or more estimators derived from members of the flexible CR family of divergence measures and optimize that combination to select an estimator that minimizes expected quadratic loss. Sampling experiments are used to illustrate the finite sample properties of the resulting estimator and the nature of the recovered sampling distribution.
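For orientation, the CR power-divergence family referenced above is conventionally written as follows. This is the standard form from the general Cressie-Read literature, not a formula reproduced from this paper, and the symbols p, q, and γ are generic notation.

```latex
% Cressie-Read power-divergence family (standard form from the CR literature):
% p = (p_1,...,p_n) are subject probabilities, q = (q_1,...,q_n) are reference
% probabilities, and gamma indexes the family member.
\[
  I(\mathbf{p},\mathbf{q},\gamma)
  = \frac{1}{\gamma(\gamma+1)}\sum_{i=1}^{n} p_i
    \left[\left(\frac{p_i}{q_i}\right)^{\gamma} - 1\right],
  \qquad \gamma \in \mathbb{R}.
\]
% The limiting cases gamma -> 0 and gamma -> -1 recover the two directed
% Kullback-Leibler divergences, which underlie two familiar MPD-type estimators.
```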

Highlights

  • Uncertainty regarding statistical models, the associated estimating equations, and the data sampling probability distribution function creates unsolved problems for information recovery. Although likelihood is a common loss function used in fitting statistical models, the optimality of a given likelihood method is fragile, inference-wise, under model uncertainty.

  • In line with the complex nature of the problem, in the sections that follow we demonstrate a convex estimation rule that seeks to choose among Minimum Power Divergence (MPD)-type estimators so as to minimize quadratic risk (QR).

  • This is achieved by taking a convex combination of estimators associated with two members of the CR family, under minimum expected quadratic loss (a minimal sketch of this rule follows these highlights).
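As a rough illustration of the convex estimation rule summarized in the highlights, the following is a minimal Python sketch, not the paper's implementation. Two stand-in estimators (ordinary least squares and a lightly shrunk ridge fit, used here in place of two CR/MPD-based estimators) are combined as α·b₁ + (1−α)·b₂, and α is chosen on a grid to minimize a bootstrap proxy for quadratic risk. The bootstrap criterion, the stand-in estimators, and all function names are assumptions made for illustration only.

```python
# Minimal sketch (not the paper's code) of a convex combination of two
# estimators with the weight chosen to minimize an estimated quadratic risk.
# The stand-in estimators and the bootstrap risk criterion are illustrative
# assumptions; the paper combines MPD/CR-based estimators under expected
# quadratic loss.
import numpy as np


def choose_convex_weight(y, X, estimator_1, estimator_2,
                         n_boot=200, alpha_grid=None, seed=0):
    """Return (alpha*, combined estimate) minimizing a bootstrap risk estimate."""
    rng = np.random.default_rng(seed)
    if alpha_grid is None:
        alpha_grid = np.linspace(0.0, 1.0, 21)

    n = len(y)
    b1_full, b2_full = estimator_1(y, X), estimator_2(y, X)

    # Bootstrap replications of both component estimators.
    boot_pairs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        boot_pairs.append((estimator_1(y[idx], X[idx]),
                           estimator_2(y[idx], X[idx])))

    # For each candidate weight, measure the squared deviation of the bootstrap
    # combinations around the full-sample combination (a crude quadratic-risk proxy).
    risks = []
    for a in alpha_grid:
        target = a * b1_full + (1.0 - a) * b2_full
        devs = [a * b1 + (1.0 - a) * b2 - target for b1, b2 in boot_pairs]
        risks.append(np.mean([np.sum(d ** 2) for d in devs]))

    a_star = float(alpha_grid[int(np.argmin(risks))])
    return a_star, a_star * b1_full + (1.0 - a_star) * b2_full


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 2))
    y = X @ np.array([1.0, -0.5]) + rng.normal(size=100)

    # Stand-ins for two MPD-type estimators: OLS and a lightly shrunk ridge fit.
    ols = lambda y, X: np.linalg.lstsq(X, y, rcond=None)[0]
    ridge = lambda y, X: np.linalg.solve(X.T @ X + 0.5 * np.eye(X.shape[1]), X.T @ y)

    alpha_star, beta_hat = choose_convex_weight(y, X, ols, ridge)
    print("alpha* =", alpha_star, "combined estimate =", beta_hat)
```

The sketch only mirrors the structure of the rule: two component estimators, a convex weight, and a quadratic loss criterion estimated from the data. In the paper the components are estimators based on two CR family members and the combination is evaluated under expected quadratic loss.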


Summary

Introduction

Uncertainty regarding statistical models, the associated estimating equations, and the data sampling probability distribution function creates unsolved problems for information recovery. In addition, the precise functional representation of the data sampling process cannot usually be justified from physical or behavioral theory. Given this situation, a natural solution is to use estimation and inference methods that are designed to deal with systems that are fundamentally stochastic, where uncertainty and random behavior are basic to information recovery. In this context [1,2], the family of likelihood functionals permits the researcher to face the resulting stochastic inverse problem and to exploit the statistical machinery of information theory to gain insights into the underlying causal behavior from a sample of data.

Minimum Power Divergence
The CR Family and Minimum Power Divergence Estimation
Relating Minimum Power Divergence to Maximum Likelihood
Identifying the Probability Space
Distance–Divergence Measures
The Case of Two CR Alternatives
Empirical Calculation of
Finite Sample Performance
Concluding Remarks
