Non-convex statistical optimization for sparse tensor graphical model

Wei Sun, Zhaoran Wang, Han Liu, Guang Cheng

Research output: Contribution to journal › Conference article

5 Citations (Scopus)

Abstract

We consider the estimation of sparse graphical models that characterize the dependency structure of high-dimensional tensor-valued data. To facilitate the estimation of the precision matrix corresponding to each way of the tensor, we assume the data follow a tensor normal distribution whose covariance has a Kronecker product structure. The penalized maximum likelihood estimation of this model involves minimizing a non-convex objective function. In spite of the non-convexity of this estimation problem, we prove that an alternating minimization algorithm, which iteratively estimates each sparse precision matrix while fixing the others, attains an estimator with the optimal statistical rate of convergence as well as consistent graph recovery. Notably, such an estimator achieves estimation consistency with only one tensor sample, which is unobserved in previous work. Our theoretical results are backed by thorough numerical studies.
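To make the alternating scheme described in the abstract concrete, below is a minimal sketch (not the authors' released code) assuming a Tlasso-style update: with the other mode-wise precision matrices held fixed, the mode-k subproblem reduces to a standard graphical lasso fit on a whitened mode-k sample covariance. The function names (`tlasso_sketch`, `mode_unfold`, `mode_multiply`), the choice of `sklearn.covariance.graphical_lasso` as the per-mode solver, the penalty values, and the simulated data are all illustrative assumptions, not part of the paper.

```python
# Hypothetical sketch of alternating minimization for a sparse tensor
# graphical model with Kronecker-structured covariance. Each mode's
# precision matrix is refit by graphical lasso while the others are fixed.
import numpy as np
from scipy.linalg import sqrtm
from sklearn.covariance import graphical_lasso


def mode_unfold(tensor, mode):
    """Mode-k matricization: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def mode_multiply(tensor, matrix, mode):
    """k-mode product: multiply `tensor` along axis `mode` by `matrix`."""
    out = np.tensordot(matrix, tensor, axes=(1, mode))
    return np.moveaxis(out, 0, mode)


def tlasso_sketch(samples, lambdas, n_iter=10):
    """Alternating minimization over the K mode-wise precision matrices.

    samples : array of shape (n, m_1, ..., m_K), n i.i.d. tensor observations
    lambdas : per-mode l1 penalties passed to graphical lasso (assumed values)
    """
    n, *dims = samples.shape
    K = len(dims)
    total = int(np.prod(dims))
    omegas = [np.eye(m) for m in dims]          # start from identity precisions

    for _ in range(n_iter):
        for k in range(K):
            # Whiten every mode except k using the current precision estimates.
            roots = [np.real(sqrtm(omegas[j])) if j != k else np.eye(dims[j])
                     for j in range(K)]
            S_k = np.zeros((dims[k], dims[k]))
            for i in range(n):
                V = samples[i]
                for j in range(K):
                    V = mode_multiply(V, roots[j], j)
                Vk = mode_unfold(V, k)
                S_k += Vk @ Vk.T
            S_k *= dims[k] / (n * total)
            # With the other modes fixed, the mode-k update is a plain
            # graphical-lasso problem on the whitened sample covariance.
            _, omegas[k] = graphical_lasso(S_k, alpha=lambdas[k])
    return omegas


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 10, 10, 10))    # n = 5 samples of a 10x10x10 tensor
    fits = tlasso_sketch(X, lambdas=[0.2, 0.2, 0.2], n_iter=3)
    print([om.shape for om in fits])
```

Each inner update is a convex graphical-lasso problem, so the non-convexity lives only in the coupling across modes; the paper's theory concerns such an alternating scheme, while the solver and penalty values above are arbitrary choices for illustration.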

Original language: English (US)
Pages (from-to): 1081-1089
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
ISSN: 1049-5258
Volume: 2015-January
State: Published - Jan 1, 2015
Externally published: Yes
Event: 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada
Duration: Dec 7, 2015 - Dec 12, 2015

Fingerprint

  • Tensors
  • Maximum likelihood estimation
  • Normal distribution
  • Recovery

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Sun, Wei; Wang, Zhaoran; Liu, Han; Cheng, Guang. Non-convex statistical optimization for sparse tensor graphical model. In: Advances in Neural Information Processing Systems, Vol. 2015-January, 2015, pp. 1081-1089.
