The GIC for model selection: A hypothesis testing approach

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, there are similarities between them. In this article we combine these two approaches together and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), which leads to a model selection procedure that inherits good properties from both approaches, i.e., its overfitting and underfitting probabilities converge to 0 as the sample size n→∞ and, when n is fixed, its overfitting probability is controlled to be approximately under a pre-assigned level of significance.
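The abstract refers to the generalized information criterion, which scores each candidate subset of regressors by a lack-of-fit term plus a penalty proportional to model size. As a hedged illustration (not the paper's specific penalty choice), the sketch below uses a common form, GIC(M) = n·log(RSS_M / n) + λ·p_M, where λ = 2 recovers AIC and λ = log n recovers BIC; the function and variable names are illustrative only.

```python
import itertools
import numpy as np

def gic(X, y, lam):
    """GIC score for one candidate design matrix X.

    Uses the common form GIC(M) = n * log(RSS_M / n) + lam * p_M,
    where lam is the penalty parameter whose choice is the subject
    of the article. (Illustrative form, not the paper's exact one.)
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + lam * p

def best_subset(X_full, y, lam):
    """Exhaustive subset selection: return (score, columns) minimizing GIC."""
    n, k = X_full.shape
    best_score, best_cols = np.inf, None
    for size in range(1, k + 1):
        for cols in itertools.combinations(range(k), size):
            score = gic(X_full[:, cols], y, lam)
            if score < best_score:
                best_score, best_cols = score, cols
    return best_score, best_cols
```

With a larger λ (e.g. λ = log n, as in BIC), the penalty grows with the sample size, which drives the overfitting probability to zero, the asymptotic behavior the abstract describes; the paper's contribution is a particular λ that also controls the finite-sample overfitting probability near a pre-assigned significance level.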

Original language: English
Pages (from-to): 215-231
Number of pages: 17
Journal: Journal of Statistical Planning and Inference
Volume: 88
Issue number: 2
State: Published - Aug 1 2000
Externally published: Yes


Keywords

  • 62J05
  • Hypothesis testing
  • Information criteria
  • Linear regression
  • Prediction error

ASJC Scopus subject areas

  • Statistics, Probability and Uncertainty
  • Applied Mathematics
  • Statistics and Probability

Cite this

The GIC for model selection: A hypothesis testing approach. / Shao, Jun; Rao, Jonnagadda S.

In: Journal of Statistical Planning and Inference, Vol. 88, No. 2, 01.08.2000, p. 215-231.

Research output: Contribution to journal › Article

@article{69fd74aa5760401ba20e6217434c5e73,
title = "The GIC for model selection: A hypothesis testing approach",
abstract = "We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, there are similarities between them. In this article we combine these two approaches together and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), which leads to a model selection procedure that inherits good properties from both approaches, i.e., its overfitting and underfitting probabilities converge to 0 as the sample size n→∞ and, when n is fixed, its overfitting probability is controlled to be approximately under a pre-assigned level of significance.",
keywords = "62J05, Hypothesis testing, Information criteria, Linear regression, Prediction error",
author = "Shao, Jun and Rao, {Jonnagadda S.}",
year = "2000",
month = "8",
day = "1",
language = "English",
volume = "88",
pages = "215--231",
journal = "Journal of Statistical Planning and Inference",
issn = "0378-3758",
publisher = "Elsevier",
number = "2",

}

TY - JOUR

T1 - The GIC for model selection

T2 - A hypothesis testing approach

AU - Shao, Jun

AU - Rao, Jonnagadda S

PY - 2000/8/1

Y1 - 2000/8/1

N2 - We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, there are similarities between them. In this article we combine these two approaches together and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), which leads to a model selection procedure that inherits good properties from both approaches, i.e., its overfitting and underfitting probabilities converge to 0 as the sample size n→∞ and, when n is fixed, its overfitting probability is controlled to be approximately under a pre-assigned level of significance.

AB - We consider the model (subset) selection problem for linear regression. Although hypothesis testing and model selection are two different approaches, there are similarities between them. In this article we combine these two approaches together and propose a particular choice of the penalty parameter in the generalized information criterion (GIC), which leads to a model selection procedure that inherits good properties from both approaches, i.e., its overfitting and underfitting probabilities converge to 0 as the sample size n→∞ and, when n is fixed, its overfitting probability is controlled to be approximately under a pre-assigned level of significance.

KW - 62J05

KW - Hypothesis testing

KW - Information criteria

KW - Linear regression

KW - Prediction error

UR - http://www.scopus.com/inward/record.url?scp=0042708473&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0042708473&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0042708473

VL - 88

SP - 215

EP - 231

JO - Journal of Statistical Planning and Inference

JF - Journal of Statistical Planning and Inference

SN - 0378-3758

IS - 2

ER -