Ridge Fusion in Statistical Learning

Bradley S. Price, Charles J. Geyer, Adam J. Rothman

Research output: Contribution to journal › Article › peer-review

14 Scopus citations


This article proposes a penalized likelihood method to jointly estimate multiple precision matrices for use in quadratic discriminant analysis (QDA) and model-based clustering. We use a ridge penalty and a ridge fusion penalty to introduce shrinkage and to promote similarity between the precision matrix estimates. We use blockwise coordinate descent for optimization and a validation likelihood for tuning-parameter selection. Our method is applied to QDA and to semi-supervised model-based clustering.
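As a rough illustration of the abstract's setup (not the authors' implementation), the penalized objective can be sketched in NumPy: a Gaussian negative log-likelihood per class, a ridge (squared Frobenius) penalty that shrinks each precision matrix, and a ridge fusion penalty on pairwise differences that promotes similarity between estimates. The exact scaling factors and penalty normalization here are assumptions.

```python
import numpy as np

def ridge_fusion_objective(Omegas, Ss, ns, lam1, lam2):
    """Sketch of a ridge-fusion penalized negative log-likelihood.

    Omegas : list of K symmetric positive-definite precision matrices (p x p)
    Ss     : list of K sample covariance matrices (p x p)
    ns     : list of K class sample sizes
    lam1   : ridge penalty weight; lam2 : ridge fusion penalty weight
    (Scaling conventions are illustrative assumptions, not the paper's.)
    """
    K = len(Omegas)
    # Gaussian negative log-likelihood term for each class:
    # n * ( tr(S Omega) - log det Omega )
    nll = sum(
        n * (np.trace(S @ Om) - np.linalg.slogdet(Om)[1])
        for n, S, Om in zip(ns, Ss, Omegas)
    )
    # Ridge penalty: shrinks every entry of each precision estimate.
    ridge = lam1 * sum(np.sum(Om ** 2) for Om in Omegas)
    # Ridge fusion penalty: pulls each pair of estimates toward each other.
    fusion = lam2 * sum(
        np.sum((Omegas[i] - Omegas[j]) ** 2)
        for i in range(K) for j in range(i + 1, K)
    )
    return nll + ridge + fusion
```

In the paper's framework this objective would be minimized over the precision matrices by blockwise coordinate descent, with `lam1` and `lam2` chosen by a validation likelihood; the function above only evaluates the objective at a candidate point.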

Original language: English (US)
Pages (from-to): 439-454
Number of pages: 16
Journal: Journal of Computational and Graphical Statistics
Issue number: 2
State: Published - Apr 3 2015


Keywords

  • Discriminant analysis
  • Joint inverse covariance matrix estimation
  • Model-based clustering
  • Semi-supervised learning

ASJC Scopus subject areas

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty


