We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, there are similarities between them. In this article we combine the two approaches and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), which leads to a model selection procedure that inherits good properties from both: its overfitting and underfitting probabilities converge to 0 as the sample size n→∞, and, for fixed n, its overfitting probability is controlled to be approximately below a pre-assigned level of significance.
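As a rough illustration of GIC-based subset selection, the sketch below scores every candidate subset with a criterion of the generic form n·log(RSS/n) + λ·|subset| and picks the minimizer. The specific form of the criterion and the penalty value λ are assumptions for illustration; the paper's proposed penalty choice, which ties λ to a pre-assigned significance level, is not reproduced here.

```python
import numpy as np
from itertools import combinations

def gic(y, X, subset, lam):
    """GIC score for a candidate subset of predictor columns.

    Generic form assumed here: n * log(RSS/n) + lam * |subset|.
    The paper's particular penalty (chosen to control the
    overfitting probability at a given significance level)
    is NOT implemented; lam is a user-supplied assumption.
    """
    n = len(y)
    Xs = X[:, list(subset)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = float(np.sum((y - Xs @ beta) ** 2))
    return n * np.log(rss / n) + lam * len(subset)

def select_subset(y, X, lam):
    """Exhaustive search over all nonempty subsets (small p only)."""
    p = X.shape[1]
    candidates = (s for k in range(1, p + 1)
                  for s in combinations(range(p), k))
    return min(candidates, key=lambda s: gic(y, X, s, lam))
```

With a BIC-like penalty λ = log(n), larger samples make the penalty dominate the chance reduction in RSS from spurious predictors, which is the mechanism behind vanishing overfitting probability.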
- Hypothesis testing
- Information criteria
- Linear regression
- Prediction error
ASJC Scopus subject areas
- Statistics, Probability and Uncertainty
- Applied Mathematics
- Statistics and Probability