Stability enhanced large-margin classifier selection

Will Wei Sun, Guang Cheng, Yufeng Liu

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Stability is an important aspect of a classification procedure, as unstable predictions can reduce users' trust in a classification system and harm the reproducibility of scientific conclusions. We introduce a measure of classification instability, the decision boundary instability (DBI), and combine it with the generalization error (GE) as a criterion for selecting the most accurate and stable classifier. To this end, we propose a two-stage algorithm: (i) select the subset of classifiers whose estimated GEs are not significantly different from the minimal estimated GE among all candidate classifiers; (ii) take the optimal classifier to be the one achieving the minimal DBI within the subset selected in stage (i). This selection principle applies to both linear and nonlinear classifiers, and large-margin classifiers are used as a prototypical example to illustrate the idea. Our selection method is shown to be consistent in the sense that the optimal classifier simultaneously achieves the minimal GE and the minimal DBI. Various simulations and real examples further demonstrate the advantage of our method over alternative approaches.
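The two-stage selection described in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes hypothetical fold-wise cross-validation errors as the GE estimates, a normal-approximation one-sided test for stage (i), and precomputed DBI values for stage (ii); all classifier names and numbers are invented for illustration.

```python
import math

def select_classifier(fold_errors, dbi, z=1.645):
    """Two-stage selection sketch:
    (i)  keep classifiers whose mean CV error is not significantly
         above the minimal mean error (normal approximation);
    (ii) among those kept, return the one with the smallest DBI."""
    names = list(fold_errors)
    stats = {}
    for name in names:
        errs = fold_errors[name]
        n = len(errs)
        mean = sum(errs) / n
        var = sum((e - mean) ** 2 for e in errs) / (n - 1)
        stats[name] = (mean, var / n)  # mean and squared standard error
    best = min(names, key=lambda k: stats[k][0])  # minimal estimated GE
    kept = []
    for name in names:
        diff = stats[name][0] - stats[best][0]
        se = math.sqrt(stats[name][1] + stats[best][1])
        if diff <= z * se:  # GE not significantly worse than the minimum
            kept.append(name)
    return min(kept, key=lambda k: dbi[k])  # minimal DBI in the subset

# Hypothetical fold-wise test errors and DBI values for three
# large-margin classifiers (illustrative numbers only).
fold_errors = {
    "svm_hinge":   [0.10, 0.12, 0.11, 0.13, 0.10],
    "logistic":    [0.11, 0.10, 0.12, 0.11, 0.12],
    "svm_squared": [0.20, 0.22, 0.21, 0.23, 0.19],
}
dbi = {"svm_hinge": 0.08, "logistic": 0.03, "svm_squared": 0.02}
print(select_classifier(fold_errors, dbi))  # → logistic
```

Here "svm_hinge" and "logistic" have statistically indistinguishable GEs, so stage (ii) breaks the tie by stability and prefers "logistic"; "svm_squared" is excluded in stage (i) despite its small DBI, because its GE is significantly worse.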

Original language: English (US)
Pages (from-to): 1-25
Number of pages: 25
Journal: Statistica Sinica
Issue number: 1
State: Published - Jan 2018


Keywords

  • Asymptotic normality
  • Large-margin
  • Model selection
  • Selection consistency
  • Stability

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


