Bayesian model selection in finite mixtures by marginal density decompositions

Hemant Ishwaran, Lancelot F. James, Jiayang Sun

Research output: Contribution to journal › Article › peer-review

61 Scopus citations


We consider the problem of estimating the number of components d and the unknown mixing distribution in a finite mixture model, in which d is bounded by some fixed finite number N. Our approach relies on a prior over the space of mixing distributions with at most N components. By decomposing the resulting marginal density under this prior, we derive a weighted Bayes factor method for consistently estimating d that can be implemented by an i.i.d. generalized weighted Chinese restaurant (GWCR) Monte Carlo algorithm. We also discuss a Gibbs sampling method (the blocked Gibbs sampler) for estimating both d and the mixing distribution. We show that the resulting posterior is consistent and achieves the frequentist optimal O_p(n^{-1/4}) rate of estimation. We compare the performance of the new GWCR model selection procedure with that of the Akaike information criterion and the Bayes information criterion implemented through an EM algorithm. Applications of our methods to five real datasets and to simulations are considered.
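The abstract benchmarks the GWCR procedure against AIC/BIC model selection implemented through an EM algorithm. The following is a minimal, hedged sketch of that baseline only (not the paper's GWCR method): each candidate number of components d ≤ N is fit by EM for a univariate Gaussian mixture, and d is chosen by BIC. All function names and the quantile-based initialization are illustrative assumptions, not taken from the paper.

```python
# Illustrative baseline: choose the number of mixture components d <= N
# by BIC, fitting each candidate model with EM. Not the GWCR procedure.
import numpy as np

def em_loglik(x, d, n_iter=200):
    """Fit a d-component univariate Gaussian mixture by EM; return its log-likelihood."""
    n = len(x)
    w = np.full(d, 1.0 / d)                        # mixing weights
    mu = np.quantile(x, (np.arange(d) + 0.5) / d)  # spread initial means over the data
    var = np.full(d, np.var(x))                    # start each component at full variance
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each observation
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: responsibility-weighted updates (variance floored for stability)
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = np.maximum((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-3)
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1) + 1e-300).sum()

def select_d(x, N=5):
    """Pick d <= N minimizing BIC = -2*loglik + p*log(n), with p = 3d - 1 free parameters."""
    n = len(x)
    return min(range(1, N + 1),
               key=lambda d: -2 * em_loglik(x, d) + (3 * d - 1) * np.log(n))

# Example: two well-separated Gaussian components
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1, 200), rng.normal(4, 1, 200)])
print(select_d(x))  # BIC should favor d = 2 here
```

BIC's log(n) penalty makes it consistent for the order of a well-separated mixture, which is why it serves as a natural frequentist comparator for the weighted Bayes factor approach described above.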

Original language: English (US)
Pages (from-to): 1316-1332
Number of pages: 17
Journal: Journal of the American Statistical Association
Issue number: 456
State: Published - Dec 1 2001
Externally published: Yes


Keywords

  • Blocked Gibbs sampler
  • Dirichlet prior
  • Generalized weighted Chinese restaurant
  • Identification
  • Partition
  • Uniformly exponentially consistent test
  • Weighted Bayes factor

ASJC Scopus subject areas

  • Mathematics (all)
  • Statistics and Probability


