Ridge Fusion in Statistical Learning

Bradley Price, Charles J. Geyer, Adam J. Rothman

Research output: Contribution to journal › Article

10 Scopus citations

Abstract

This article proposes a penalized likelihood method to jointly estimate multiple precision matrices for use in quadratic discriminant analysis (QDA) and model-based clustering. We use a ridge penalty and a ridge fusion penalty to introduce shrinkage and promote similarity between precision matrix estimates. We use blockwise coordinate descent for optimization, and validation likelihood is used for tuning parameter selection. Our method is applied in QDA and semi-supervised model-based clustering.
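The objective described in the abstract combines a negative log-likelihood over classes, a ridge penalty on each precision matrix, and a ridge fusion penalty on pairwise differences. A minimal sketch of that objective is below; the function and variable names are illustrative (not the authors' code), and the exact scaling of each term is an assumption based on the general form described:

```python
import numpy as np

def ridge_fusion_objective(omegas, sample_covs, n_ks, lam1, lam2):
    """Penalized negative log-likelihood for K precision matrices.

    omegas      : list of K (p, p) precision matrix estimates
    sample_covs : list of K (p, p) sample covariance matrices
    n_ks        : list of K class sample sizes
    lam1, lam2  : ridge and ridge-fusion tuning parameters
    (Names and term scaling are illustrative, not the paper's exact code.)
    """
    K = len(omegas)
    # Gaussian negative log-likelihood term, summed over classes
    nll = sum(
        n_k * (np.trace(S @ Om) - np.linalg.slogdet(Om)[1])
        for Om, S, n_k in zip(omegas, sample_covs, n_ks)
    )
    # Ridge penalty: shrinks each precision estimate (squared Frobenius norm)
    ridge = lam1 * sum(np.sum(Om ** 2) for Om in omegas)
    # Ridge fusion penalty: promotes similarity between pairs of estimates
    fusion = lam2 * sum(
        np.sum((omegas[k] - omegas[l]) ** 2)
        for k in range(K) for l in range(k + 1, K)
    )
    return nll + ridge + fusion
```

As lam2 grows, the fusion term pulls the class-specific precision estimates toward a common matrix, which is what lets the method interpolate between QDA-like and LDA-like decision rules.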

Original language: English (US)
Pages (from-to): 439-454
Number of pages: 16
Journal: Journal of Computational and Graphical Statistics
Volume: 24
Issue number: 2
DOIs
State: Published - Apr 3 2015

Keywords

  • Discriminant analysis
  • Joint inverse covariance matrix estimation
  • Model-based clustering
  • Semi-supervised learning

ASJC Scopus subject areas

  • Discrete Mathematics and Combinatorics
  • Statistics and Probability
  • Statistics, Probability and Uncertainty
