Abstract

We propose and investigate a new estimation method for the parameters of models consisting of smooth density functions on the positive half axis. The procedure is based on a recently introduced characterization result for the respective probability distributions, and is to be classified as a minimum distance estimator, incorporating the Lq‐norm as a distance function. Throughout, we deal rigorously with issues of existence and measurability of these implicitly defined estimators. Moreover, we provide consistency results in a common asymptotic setting, and compare our new method with classical estimators for the exponential, the Rayleigh and the Burr Type XII distributions in Monte Carlo simulation studies. We also assess the performance of different estimators for non‐normalized models in the context of an exponential‐polynomial family.

Highlights

  • One of the most classical problems in statistics is the estimation of the parameter vector of a parametrized family of probability distributions

  • It presents itself in a significant share of applications because parametric models often strike a reasonable compromise between flexibility in the shape of the statistical model and the meaningfulness of the conclusions that can be drawn from it

  • A second class of methods incorporates the idea of using as an estimator the value that minimizes some goodness-of-fit measure. To implement estimators of this type, the empirical distribution, quantile or characteristic function is compared to its theoretical counterpart from the underlying parametric model in a suitable distance, and this distance is minimized over the parameter space, see Wolfowitz (1957), or Parr (1981) for an early bibliography


Summary

INTRODUCTION

One of the most classical problems in statistics is the estimation of the parameter vector of a parametrized family of probability distributions. A second class of methods incorporates the idea of using as an estimator the value that minimizes some goodness-of-fit measure. To implement estimators of this type, the empirical distribution, quantile or characteristic function is compared to its theoretical counterpart from the underlying parametric model in a suitable distance, and this distance is minimized over the parameter space, see Wolfowitz (1957), or Parr (1981) for an early bibliography. Later on we discuss noise-contrastive estimation, a concept introduced by Gutmann & Hyvärinen (2010). All these references indicate that statistical inference for non-normalized models is a topic of very recent investigation that interests researchers in machine learning, a fact which we further allude to at the end of the following section.
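To make the minimum distance idea concrete, the following Python sketch illustrates the general recipe described above for a simple case: the rate of an exponential model is estimated by minimizing an L2-type distance between the empirical CDF and the model CDF at the sample points. This is only an illustrative example of the minimum distance principle, not the paper's characterization-based, Lq-norm procedure; the function name, the Cramér-von Mises-type criterion and the bounded search interval are assumptions made for the sketch.

```python
# Minimal sketch of a minimum distance estimator for the exponential rate:
# compare the empirical CDF with the model CDF and minimize an L2-type
# criterion over the parameter space (here, the positive half axis).
import numpy as np
from scipy.optimize import minimize_scalar

def min_distance_exponential(sample):
    """Estimate the exponential rate by minimizing an L2 distance
    between empirical and theoretical CDFs at the order statistics."""
    x = np.sort(sample)
    n = len(x)
    ecdf = (np.arange(1, n + 1) - 0.5) / n     # empirical CDF at the order statistics

    def distance(rate):
        model_cdf = 1.0 - np.exp(-rate * x)    # exponential CDF F(x) = 1 - exp(-rate * x)
        return np.sum((ecdf - model_cdf) ** 2)

    # minimize the goodness-of-fit criterion over a bounded interval for the rate
    result = minimize_scalar(distance, bounds=(1e-8, 100.0), method="bounded")
    return result.x

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=500)    # true rate = 0.5
print(min_distance_exponential(data))          # should be close to 0.5
```

The same pattern carries over to other distances and other empirical quantities (quantile or characteristic functions); only the criterion inside `distance` changes.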

THE NEW ESTIMATORS
EXISTENCE AND MEASURABILITY
CONSISTENCY
EXAMPLE
NOTES AND COMMENTS