Ridge Fusion in Statistical Learning

Bradley Price, Charles J. Geyer, Adam J. Rothman

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

This article proposes a penalized likelihood method to jointly estimate multiple precision matrices for use in quadratic discriminant analysis (QDA) and model-based clustering. We use a ridge penalty and a ridge fusion penalty to introduce shrinkage and promote similarity between precision matrix estimates. We use blockwise coordinate descent for optimization, and validation likelihood is used for tuning parameter selection. Our method is applied in QDA and semi-supervised model-based clustering.
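To make the abstract's objective concrete, below is a minimal illustrative sketch (not the authors' implementation) of a ridge- plus ridge-fusion-penalized negative Gaussian log-likelihood over C classes. The names (Omegas, S_list, n_list, lam1, lam2) and the exact scaling of each term are assumptions for illustration; the article's parameterization may differ.

```python
# Illustrative sketch only; variable names, term scaling, and parameterization
# are assumptions and may differ from the article's formulation.
import numpy as np

def ridge_fusion_objective(Omegas, S_list, n_list, lam1, lam2):
    """Negative Gaussian log-likelihood over C classes plus a ridge penalty
    (shrinkage) and a ridge fusion penalty (similarity across classes)."""
    C = len(Omegas)
    # Class-wise fit terms: n_c * [tr(S_c Omega_c) - log det(Omega_c)]
    nll = sum(
        n * (np.trace(S @ Om) - np.linalg.slogdet(Om)[1])
        for Om, S, n in zip(Omegas, S_list, n_list)
    )
    # Ridge penalty: shrinks each precision matrix estimate (squared Frobenius norm)
    ridge = lam1 * sum(np.sum(Om * Om) for Om in Omegas)
    # Ridge fusion penalty: penalizes pairwise differences between precision estimates
    fusion = lam2 * sum(
        np.sum((Omegas[c] - Omegas[m]) ** 2)
        for c in range(C)
        for m in range(c + 1, C)
    )
    return nll + ridge + fusion
```

Per the abstract, an objective of this form is minimized by blockwise coordinate descent over the precision matrices, with the tuning parameters (here lam1, lam2) selected by validation likelihood.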

Original language: English (US)
Pages (from-to): 439-454
Number of pages: 16
Journal: Journal of Computational and Graphical Statistics
Volume: 24
Issue number: 2
DOIs: 10.1080/10618600.2014.920709
State: Published - Apr 3 2015

Keywords

  • Discriminant analysis
  • Joint inverse covariance matrix estimation
  • Model-based clustering
  • Semi-supervised learning

ASJC Scopus subject areas

  • Discrete Mathematics and Combinatorics
  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Ridge Fusion in Statistical Learning. / Price, Bradley; Geyer, Charles J.; Rothman, Adam J.

In: Journal of Computational and Graphical Statistics, Vol. 24, No. 2, 03.04.2015, p. 439-454.
