Abstract

The problem of convergence of the moments of a sequence of random variables to the moments of its asymptotic distribution is important in many applications. These include the determination of the optimal training sample size in the cross-validation estimation of the generalization error of computer algorithms, and the construction of graphical methods for studying dependence patterns between two biomarkers. In this paper we prove the uniform integrability of the ordinary least squares estimators of a linear regression model, under suitable assumptions on the design matrix and on the moments of the errors. Further, we prove the convergence of the moments of the estimators to the corresponding moments of their asymptotic distribution, and study the rate of this moment convergence. The canonical central limit theorem corresponds to the simplest linear regression model. We investigate the rate of the moment convergence in the canonical central limit theorem, proving a sharp improvement of von Bahr's (1965) theorem.
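For orientation, the following is a minimal sketch of the setting described in the abstract; the notation is illustrative and assumed here, not taken from the paper itself.

```latex
% Illustrative sketch (notation assumed, not the paper's own).
% Linear model with fixed design and i.i.d. mean-zero errors:
%   y_n = X_n \beta + \varepsilon_n,  E\varepsilon_i = 0,  E\varepsilon_i^2 = \sigma^2.
\[
  \hat\beta_n = (X_n^\top X_n)^{-1} X_n^\top y_n ,
  \qquad
  T_n := \sigma^{-1}\,(X_n^\top X_n)^{1/2}(\hat\beta_n - \beta)
  \;\xrightarrow{\,d\,}\; N(0, I_p).
\]
% Moment convergence, the paper's theme: if the family \{\|T_n\|^r\}_{n\ge 1}
% is uniformly integrable (e.g., under a moment condition such as
% E|\varepsilon_1|^{r+\delta} < \infty together with suitable design
% assumptions), then convergence in distribution upgrades to
\[
  E\,\|T_n\|^{\,r} \;\longrightarrow\; E\,\|Z\|^{\,r},
  \qquad Z \sim N(0, I_p).
\]
% The canonical CLT is the intercept-only model y_i = \mu + \varepsilon_i:
% here \hat\mu_n = \bar y_n, and T_n = \sqrt{n}\,(\bar y_n - \mu)/\sigma is
% the normalized partial sum whose moment convergence von Bahr (1965) studied.
```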
