Abstract

We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, they share similarities. In this article we combine the two approaches and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), leading to a model selection procedure that inherits good properties from both: its overfitting and underfitting probabilities converge to 0 as the sample size n → ∞, and, for fixed n, its overfitting probability is controlled to be approximately below a pre-assigned level of significance.
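For context, a minimal sketch of GIC-based subset selection is given below in a commonly used form; the notation (RSS_M, |M|, the variance estimate, and the exact scaling of the criterion) is an assumption for illustration, and the article's specific choice of the penalty parameter is not reproduced here.

% Sketch of a standard GIC form (assumed notation, not the article's):
% RSS_M is the residual sum of squares of the least-squares fit using the
% candidate subset M of predictors, |M| is the subset size, and \hat\sigma^2
% is a variance estimate (e.g., from the full model). The selected model
% minimizes the criterion over candidate subsets.
\[
  \mathrm{GIC}_{\lambda_n}(M) \;=\; \frac{\mathrm{RSS}_M}{n}
    \;+\; \lambda_n \,\frac{\hat{\sigma}^2 \, |M|}{n},
  \qquad
  \widehat{M} \;=\; \operatorname*{arg\,min}_{M}\ \mathrm{GIC}_{\lambda_n}(M).
\]
% Familiar special cases: \lambda_n = 2 gives an AIC/C_p-type rule and
% \lambda_n = \log n a BIC-type rule. The article's contribution is a
% particular choice of \lambda_n, motivated by hypothesis testing, under
% which both error probabilities vanish asymptotically while the
% finite-sample overfitting probability stays near a preset significance level.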
