Abstract

Model selection has different connotations in Statistics and in the History or Philosophy of Science. In Statistics, it has the useful but much more pedestrian role of distinguishing between two statistical models on the basis of available data. This chapter presents several examples where model selection techniques can be applied to answer scientific or statistical questions. It considers the Akaike Information Criterion (AIC) in a few canonical statistical problems and states results on its statistical optimality therein. Its connection with other model selection criteria, as well as some of its generalizations, is also discussed. The optimality is connected with Akaike's original motivation, as brought out in the chapter, but it does not follow as an immediate consequence. One very important problem where AIC can be used as a model selection rule is nonparametric regression, where the functional form of the dependence between the dependent variable and the regressor is not expressible in terms of finitely many unknown parameters. The Bayesian Information Criterion (BIC) is more useful for selecting a correct model, while the AIC is more appropriate for finding the best model for predicting future observations.
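The contrast between AIC (prediction-oriented) and BIC (consistency-oriented) can be illustrated with a small sketch. The example below is an assumption for illustration, not taken from the chapter: it fits polynomial regressions of increasing degree to simulated data and computes the Gaussian forms of AIC and BIC, where for n observations, residual sum of squares RSS, and k estimated coefficients, AIC = n log(RSS/n) + 2k and BIC = n log(RSS/n) + k log(n).

```python
import numpy as np

# Hypothetical illustration: choosing the degree of a polynomial regression
# with AIC and BIC. The data-generating model, noise level, and sample size
# are assumptions made for this sketch.
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.2, size=n)  # true degree is 2

def gaussian_ic(y, y_hat, k, n):
    """AIC and BIC for a Gaussian regression with k estimated coefficients."""
    rss = np.sum((y - y_hat) ** 2)
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

for degree in range(1, 7):
    # Least-squares polynomial fit with degree + 1 coefficients.
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    aic, bic = gaussian_ic(y, y_hat, degree + 1, n)
    print(f"degree={degree}  AIC={aic:8.2f}  BIC={bic:8.2f}")
```

In runs of this kind, the BIC-minimizing degree tends to recover the true finite-dimensional model because of its heavier log(n) penalty, while AIC may admit a slightly richer model that predicts marginally better, which is the trade-off summarized in the last sentence of the abstract.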
