Abstract

The generalized information criterion (GIC) selects a linear regression model by minimizing the sum of squared residuals plus a penalty parameter λ times a linear function of the model dimension. It is known that the GIC is asymptotically consistent in the sense that the probability of its selecting a non-optimal model converges to zero, provided λ→∞ at a certain rate as the sample size increases to ∞. In the present paper we establish convergence rates for the error probabilities of the GIC, in terms of λ and the order of the design matrix. The rates obtained here are sharper than the existing ones in the literature when the distribution of the response variable is non-normal. A discussion of the choice of the penalty parameter λ is also given.
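For concreteness, a minimal formalization of the criterion described above (the notation here is ours and may differ from that of the full paper): for a candidate model M with design submatrix X_M, least-squares estimate \hat{\beta}_M, and dimension d_M,

    \mathrm{GIC}_{\lambda}(M) = \| y - X_M \hat{\beta}_M \|^2 + \lambda\, d_M,

where the penalty may more generally be λ times a linear function of d_M, and the GIC selects the candidate model minimizing \mathrm{GIC}_{\lambda}(M). Choosing λ = 2σ² recovers a Mallows' C_p-type rule and λ = σ² log n a BIC-type rule, which is why the rate at which λ grows governs consistency.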
