Abstract

The least squares estimator for the linear regression model is shown to converge to the true parameter vector either with probability one or with probability zero. In the latter case, it either converges to a point not equal to the true parameter with probability one, or it diverges with probability one. These results are shown to hold under weak conditions on the dependent variable and the regressors; no additional conditions are placed on the errors. The dependent variable and regressors are assumed to be weakly dependent, in particular strong mixing. The regressors may be fixed or random and must exhibit a certain degree of independent variability. No further assumptions are needed. The model considered allows the number of regressors to increase without bound as the sample size increases. The proof proceeds by extending Kolmogorov's 0-1 law for independent random variables to strong mixing random variables.
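
To make the dichotomy concrete, here is a minimal sketch of the setup in standard linear-model notation; the particular symbols are an assumption for illustration, not taken from the paper:
$$
y_t = x_t'\beta_0 + u_t, \qquad
\hat{\beta}_n = \Big(\sum_{t=1}^{n} x_t x_t'\Big)^{-1} \sum_{t=1}^{n} x_t y_t .
$$
The zero-one result then says that
$$
P\big(\hat{\beta}_n \to \beta_0\big) \in \{0,\,1\},
$$
and, when this probability is zero, the analogous dichotomy holds for convergence to any other fixed point and for divergence.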
