Abstract

In this paper, conditions for the consistent selection of a subset from a large set of potential regressors are derived. It is assumed that the number of potential regressors increases as the sample size increases and, in addition, that the regressors are orthogonal. Subset-selection criteria are proposed which satisfy these conditions. These criteria do not depend on any tuning parameters. It is also shown that some other criteria, including AIC and BIC, violate these conditions. Simulation studies with different sample sizes and large sets of orthogonal regressors are conducted to compare the performance of the new criteria with that of conventional model-selection criteria. The results of these simulation studies corroborate the theoretical findings. In large samples, the consistent criteria always make the correct decisions: they include all genuine regressors and exclude the others. In contrast, AIC tends always to select the maximum number of regressors, and BIC is not competitive either when the number of potential regressors grows too fast.
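As a minimal sketch of the setting the abstract describes (not code from the paper, and with all sample sizes, coefficient values, and noise levels chosen purely for illustration): when the regressors are orthonormal, dropping regressor j raises the residual sum of squares by exactly the squared OLS coefficient, so minimising an information criterion reduces to keeping each regressor whose squared coefficient exceeds the criterion's per-parameter penalty times the error variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k_true = 500, 50, 5          # sample size, candidate regressors, genuine regressors

# Orthonormal columns via QR, matching the orthogonal-regressor assumption.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
beta = np.zeros(p)
beta[:k_true] = 1.0                # only the genuine regressors have nonzero coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)

# With orthonormal X, the OLS coefficients are simply X'y, and excluding
# regressor j increases the RSS by exactly b[j]**2.  Subset selection by an
# information criterion therefore reduces to per-coefficient thresholding:
# keep regressor j iff b[j]**2 > (penalty per parameter) * sigma2.
b = X.T @ y
sigma2 = np.sum((y - X @ b) ** 2) / (n - p)   # residual variance estimate

aic_keep = np.flatnonzero(b**2 > 2 * sigma2)          # AIC penalty: 2 per parameter
bic_keep = np.flatnonzero(b**2 > np.log(n) * sigma2)  # BIC penalty: log(n) per parameter
print("AIC keeps:", aic_keep)
print("BIC keeps:", bic_keep)
```

Because AIC's penalty of 2 is fixed while the number of irrelevant candidates grows, a roughly constant fraction of them clears its threshold, which is one way to see the overselection the abstract reports; BIC's log(n) penalty is stricter but, as the paper argues, still fails when the number of potential regressors grows too fast.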
