Abstract

Although statements to the contrary are often made, the application of the principle of least squares is not limited to situations in which p is normally distributed. The Gauss-Markov theorem is to the effect that, among unbiased estimates which are linear functions of the observations, those yielded by least squares have minimum variance, and the independence of this property from any assumption regarding the form of the distribution is just one of the striking characteristics of the principle of least squares. The principle of maximum likelihood, on the other hand, requires for its application a knowledge of the probability distribution of p. Under this principle one estimates the parameters a, …, so that, were the estimates the true values, the probability of the total set of observations of p would be a maximum. This principle has great intuitive appeal, is probably the oldest existing rule of estimation, and has been widely used in practical applications under the name of the most probable value. If the p_i's are normally distributed about P_i with a standard deviation independent of P_i, the principle of maximum likelihood yields the same estimate as does least squares, and Gauss is said to have derived least squares from this application. In recent years, the principle of maximum likelihood has been strongly advanced under the influence of the teachings of Sir Ronald Fisher, who in a renowned paper of 1922 and in later writings [1] outlined a comprehensive and unified system of mathematical statistics as well as a philosophy of statistical inference that has had profound and wide development. Neyman [2], in a fundamental paper in 1949, defined a family of estimates,
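As a minimal sketch of this equivalence (the symbol n for the number of observations, the symbol σ for the common standard deviation, and the explicit Gaussian density are notational assumptions, not taken from the abstract): if the p_i are independent and normally distributed about the P_i with common standard deviation σ, the likelihood of the sample is

\[
L = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left[ -\frac{(p_i - P_i)^2}{2\sigma^2} \right],
\qquad
\log L = -n \log\!\left(\sigma\sqrt{2\pi}\right) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (p_i - P_i)^2 ,
\]

so that, with σ not depending on the parameters, maximizing L over the parameters is the same as minimizing \(\sum_i (p_i - P_i)^2\), the least-squares criterion.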
