Abstract

This chapter provides an overview of the problem of inference. It reviews the basis of statistical inference and some tests of hypotheses. The real inference problem starts when two things are given: an observed value of x and a function of two variables p_n(x | θ), viz., the distribution of x depending on the parameter θ. The chapter presents some global statements on parameters and considers confidence limits, or confidence intervals. The method of confidence intervals can be extended to the case of k chance variables and l parameters. The most restricted form of inference from an observed x to an unknown parameter θ is the estimation of θ. The maximum likelihood estimate is the value of θ that, for an observed x, has the greatest chance of being correct, or the greatest chance density, under the assumption that the a priori chance is equal for all θ. Note that p(x | θ) is not a probability or a chance in θ; it is a probability or a chance in x. The chapter discusses the consistency of maximum likelihood estimates: an estimate t(x) = t(x1, x2, …, xn) of a theoretical parameter θ is called consistent if, for every ε > 0, the probability that |t − θ| < ε tends toward one as n → ∞. The chapter also discusses some other definitions connected with the estimation problem.
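As a concrete illustration of the two ideas summarized above, and not part of the original chapter, the following minimal Python sketch computes the maximum likelihood estimate of a normal mean (which, for a normal sample with known variance, is the sample mean) and checks the consistency property empirically: the simulated probability that |t − θ| < ε approaches one as n grows. The parameter values and function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0  # true parameter (mean of the normal distribution); illustrative value
eps = 0.1    # the epsilon in the consistency definition; illustrative value

def mle_mean(x):
    # For a normal sample with known variance, the sample mean maximizes
    # the likelihood p(x | theta) viewed as a function of theta.
    return x.mean()

# Estimate P(|t - theta| < eps) by simulation for growing n; consistency
# says this probability tends toward one as n tends to infinity.
for n in [10, 100, 1000, 10000]:
    trials = 2000
    hits = 0
    for _ in range(trials):
        x = rng.normal(loc=theta, scale=1.0, size=n)
        if abs(mle_mean(x) - theta) < eps:
            hits += 1
    print(f"n={n:6d}  P(|t - theta| < {eps}) ~ {hits / trials:.3f}")
```

Running the sketch shows the simulated probability climbing toward one with n, matching the definition of consistency given in the abstract.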
