Fitting a linear-linear piecewise growth mixture model with unknown knots: A comparison of two common approaches to inference

Nidhi Kohli, John Hughes, Chun Wang, Cengiz Zopluoglu, Mark L. Davison

Research output: Contribution to journal › Article

18 Citations (Scopus)

Abstract

A linear-linear piecewise growth mixture model (PGMM) is appropriate for analyzing segmented (disjointed) change in individual behavior over time, where the data come from a mixture of 2 or more latent classes, and the underlying growth trajectories in the different segments of the developmental process within each latent class are linear. A PGMM allows the knot (change point), the time of transition from 1 phase (segment) to another, to be estimated (when it is not known a priori) along with the other model parameters. To assist researchers in deciding which estimation method is most advantageous for analyzing this kind of mixture data, the current research compares 2 popular approaches to inference for PGMMs: maximum likelihood (ML) via an expectation-maximization (EM) algorithm, and Markov chain Monte Carlo (MCMC) for Bayesian inference. Monte Carlo simulations were carried out to investigate and compare the ability of the 2 approaches to recover the true parameters in linear-linear PGMMs with unknown knots. The results show that MCMC for Bayesian inference outperformed ML via EM in nearly every simulation scenario. Real data examples are also presented, and the corresponding computer codes for model fitting are provided in the Appendix to aid practitioners who wish to apply this class of models.
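The model-fitting code referred to above appears in the article's Appendix and is not reproduced here. As a rough, illustrative sketch only, the following Python snippet simulates data from a two-class linear-linear PGMM under one common spline parameterization, E[y_ti | class k] = beta0_k + beta1_k·t_i + beta2_k·max(0, t_i − gamma_k), where gamma_k is the class-specific knot and beta2_k is the change in slope after it. The parameterization, parameter values, variances, and number of measurement occasions are assumptions for illustration, not values taken from the article.

# Hypothetical sketch (not the article's Appendix code): simulate longitudinal
# data from a two-class linear-linear piecewise growth mixture model with
# class-specific knots, using a common spline parameterization.
import numpy as np

rng = np.random.default_rng(42)

def piecewise_mean(t, beta0, beta1, beta2, knot):
    """Linear-linear trajectory: slope beta1 before the knot, beta1 + beta2 after."""
    return beta0 + beta1 * t + beta2 * np.maximum(0.0, t - knot)

# Illustrative class-specific parameters: intercept, pre-knot slope,
# post-knot slope change, knot location, and mixing proportion.
classes = [
    dict(beta0=10.0, beta1=2.0, beta2=-1.5, knot=3.0, prob=0.6),
    dict(beta0=12.0, beta1=0.5, beta2=1.0,  knot=5.0, prob=0.4),
]

n, times = 200, np.arange(0.0, 9.0)  # 200 individuals, 9 measurement occasions
labels = rng.choice(len(classes), size=n, p=[c["prob"] for c in classes])

y = np.empty((n, len(times)))
for i, k in enumerate(labels):
    c = classes[k]
    # Individual deviations in intercept and pre-knot slope, plus residual noise
    # (all variances are made up for illustration).
    b0 = c["beta0"] + rng.normal(0.0, 1.0)
    b1 = c["beta1"] + rng.normal(0.0, 0.3)
    y[i] = piecewise_mean(times, b0, b1, c["beta2"], c["knot"]) + rng.normal(0.0, 0.5, len(times))

print(y.shape)  # (200, 9): outcomes one could pass to an ML-EM or MCMC fitting routine

Data like y, with latent class membership and knot locations treated as unknown, is the kind of input for which the article compares ML via EM against MCMC for Bayesian inference.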

Original language: English (US)
Pages (from-to): 259-275
Number of pages: 17
Journal: Psychological Methods
Volume: 20
Issue number: 2
DOIs: 10.1037/met0000034
State: Published - Jan 1 2015

Keywords

  • Bayesian
  • Finite mixture
  • Longitudinal data
  • Maximum likelihood
  • Piecewise function

ASJC Scopus subject areas

  • Psychology (miscellaneous)

Cite this

Fitting a linear-linear piecewise growth mixture model with unknown knots: A comparison of two common approaches to inference. / Kohli, Nidhi; Hughes, John; Wang, Chun; Zopluoglu, Cengiz; Davison, Mark L.

In: Psychological Methods, Vol. 20, No. 2, 01.01.2015, p. 259-275.

Research output: Contribution to journal › Article

@article{5e2f37a9b3a74a5882e3b9572dbb5aad,
title = "Fitting a linear-linear piecewise growth mixture model with unknown knots: A comparison of two common approaches to inference",
abstract = "A linear-linear piecewise growth mixture model (PGMM) is appropriate for analyzing segmented (disjointed) change in individual behavior over time, where the data come from a mixture of 2 or more latent classes, and the underlying growth trajectories in the different segments of the developmental process within each latent class are linear. A PGMM allows the knot (change point), the time of transition from 1 phase (segment) to another, to be estimated (when it is not known a priori) along with the other model parameters. To assist researchers in deciding which estimation method is most advantageous for analyzing this kind of mixture data, the current research compares 2 popular approaches to inference for PGMMs: maximum likelihood (ML) via an expectation-maximization (EM) algorithm, and Markov chain Monte Carlo (MCMC) for Bayesian inference. Monte Carlo simulations were carried out to investigate and compare the ability of the 2 approaches to recover the true parameters in linear-linear PGMMs with unknown knots. The results show that MCMC for Bayesian inference outperformed ML via EM in nearly every simulation scenario. Real data examples are also presented, and the corresponding computer codes for model fitting are provided in the Appendix to aid practitioners who wish to apply this class of models.",
keywords = "Bayesian, Finite mixture, Longitudinal data, Maximum likelihood, Piecewise function",
author = "Kohli, Nidhi and Hughes, John and Wang, Chun and Zopluoglu, Cengiz and Davison, {Mark L.}",
year = "2015",
month = "1",
day = "1",
doi = "10.1037/met0000034",
language = "English (US)",
volume = "20",
pages = "259--275",
journal = "Psychological Methods",
issn = "1082-989X",
publisher = "American Psychological Association Inc.",
number = "2",

}

TY - JOUR
T1 - Fitting a linear-linear piecewise growth mixture model with unknown knots
T2 - A comparison of two common approaches to inference
AU - Kohli, Nidhi
AU - Hughes, John
AU - Wang, Chun
AU - Zopluoglu, Cengiz
AU - Davison, Mark L.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - A linear-linear piecewise growth mixture model (PGMM) is appropriate for analyzing segmented (disjointed) change in individual behavior over time, where the data come from a mixture of 2 or more latent classes, and the underlying growth trajectories in the different segments of the developmental process within each latent class are linear. A PGMM allows the knot (change point), the time of transition from 1 phase (segment) to another, to be estimated (when it is not known a priori) along with the other model parameters. To assist researchers in deciding which estimation method is most advantageous for analyzing this kind of mixture data, the current research compares 2 popular approaches to inference for PGMMs: maximum likelihood (ML) via an expectation-maximization (EM) algorithm, and Markov chain Monte Carlo (MCMC) for Bayesian inference. Monte Carlo simulations were carried out to investigate and compare the ability of the 2 approaches to recover the true parameters in linear-linear PGMMs with unknown knots. The results show that MCMC for Bayesian inference outperformed ML via EM in nearly every simulation scenario. Real data examples are also presented, and the corresponding computer codes for model fitting are provided in the Appendix to aid practitioners who wish to apply this class of models.
AB - A linear-linear piecewise growth mixture model (PGMM) is appropriate for analyzing segmented (disjointed) change in individual behavior over time, where the data come from a mixture of 2 or more latent classes, and the underlying growth trajectories in the different segments of the developmental process within each latent class are linear. A PGMM allows the knot (change point), the time of transition from 1 phase (segment) to another, to be estimated (when it is not known a priori) along with the other model parameters. To assist researchers in deciding which estimation method is most advantageous for analyzing this kind of mixture data, the current research compares 2 popular approaches to inference for PGMMs: maximum likelihood (ML) via an expectation-maximization (EM) algorithm, and Markov chain Monte Carlo (MCMC) for Bayesian inference. Monte Carlo simulations were carried out to investigate and compare the ability of the 2 approaches to recover the true parameters in linear-linear PGMMs with unknown knots. The results show that MCMC for Bayesian inference outperformed ML via EM in nearly every simulation scenario. Real data examples are also presented, and the corresponding computer codes for model fitting are provided in the Appendix to aid practitioners who wish to apply this class of models.
KW - Bayesian
KW - Finite mixture
KW - Longitudinal data
KW - Maximum likelihood
KW - Piecewise function
UR - http://www.scopus.com/inward/record.url?scp=84929509342&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84929509342&partnerID=8YFLogxK
U2 - 10.1037/met0000034
DO - 10.1037/met0000034
M3 - Article
C2 - 25867487
AN - SCOPUS:84929509342
VL - 20
SP - 259
EP - 275
JO - Psychological Methods
JF - Psychological Methods
SN - 1082-989X
IS - 2
ER -