The 20th century began on an auspicious statistical note with the publication of Karl Pearson's (Philos. Mag. Ser. 5, 50 (1900) 157) goodness-of-fit test, which is regarded as one of the most important scientific breakthroughs. The basic motivation behind this test was to see whether an assumed probability model adequately described the data at hand. Pearson (Philos. Trans. Roy. Soc. London Ser. A 185 (1894) 71) also introduced a formal approach to statistical estimation through his method of moments (MM) estimation. Ronald A. Fisher, while a third-year undergraduate at Gonville and Caius College, Cambridge, suggested the maximum likelihood estimation (MLE) procedure as an alternative to Pearson's MM approach. In 1922 Fisher published a monumental paper that introduced such basic concepts as consistency, efficiency, and sufficiency, and even the term "parameter" with its present meaning. Fisher (Philos. Trans. Roy. Soc. London Ser. A 222 (1922) 309) provided the analytical foundation of MLE and studied its efficiency relative to the MM estimator. Fisher (J. Roy. Statist. Soc. 87 (1924a) 442) established the asymptotic equivalence of the minimum χ² and ML estimators and wrote in favor of using the minimum χ² method rather than Pearson's MM approach. Recently, econometricians have found working under assumed likelihood functions restrictive and have suggested using a generalized version of Pearson's MM approach, commonly known as the generalized method of moments (GMM) estimation procedure, as advocated in Hansen (Econometrica 50 (1982) 1029). Earlier, Godambe (Ann. Math. Statist. 31 (1960) 1208) and Durbin (J. Roy. Statist. Soc. Ser. B 22 (1960) 139) had developed the estimating function (EF) approach to estimation, which has proven very useful for many statistical models. A fundamental result is that the score function is the optimal EF. Ferguson (Ann. Math. Statist. 29 (1958) 1046) considered an approach very similar to GMM and showed that estimation based on the Pearson χ² statistic is equivalent to efficient GMM. Golan et al. (Maximum Entropy Econometrics: Robust Estimation with Limited Data. Wiley, New York, 1996) developed an entropy-based formulation that allowed them to solve a wide range of estimation and inference problems in econometrics. More recently, Imbens et al. (Econometrica 66 (1998) 333), Kitamura and Stutzer (Econometrica 65 (1997) 861) and Mittelhammer et al. (Econometric Foundations. Cambridge University Press, Cambridge, 2000) have placed GMM within the framework of empirical likelihood (EL) and maximum entropy (ME) estimation. It can be shown that many of these estimation techniques can be obtained as special cases of minimizing the Cressie and Read (J. Roy. Statist. Soc. Ser. B 46 (1984) 440) power-divergence criterion, which derives directly from the Pearson (1900) χ² statistic. In this way we are able to assimilate a number of seemingly unrelated estimation techniques into a unified framework.
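As a point of reference for the last claim, a standard statement of the Cressie and Read power-divergence family, written here in its goodness-of-fit form with the usual (assumed) notation of O_i for observed and E_i for expected cell counts, is

\[
2 n I^{\lambda}(O : E) \;=\; \frac{2}{\lambda(\lambda+1)} \sum_{i} O_i \left[ \left( \frac{O_i}{E_i} \right)^{\lambda} - 1 \right].
\]

Particular values of \(\lambda\) recover familiar statistics: \(\lambda = 1\) gives Pearson's χ², the limit \(\lambda \to 0\) gives the likelihood-ratio statistic, \(\lambda = -1/2\) gives the Freeman–Tukey statistic, and \(\lambda = -2\) gives Neyman's modified χ².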