### Abstract

We consider the estimation of sparse graphical models that characterize the dependency structure of high-dimensional tensor-valued data. To facilitate the estimation of the precision matrix corresponding to each way of the tensor, we assume the data follow a tensor normal distribution whose covariance has a Kronecker product structure. The penalized maximum likelihood estimation of this model involves minimizing a non-convex objective function. Despite the non-convexity of this estimation problem, we prove that an alternating minimization algorithm, which iteratively estimates each sparse precision matrix while fixing the others, attains an estimator with the optimal statistical rate of convergence as well as consistent graph recovery. Notably, such an estimator achieves estimation consistency with only one tensor sample, a guarantee not established in previous work. Our theoretical results are backed by thorough numerical studies.
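The alternating scheme described above can be illustrated with a minimal sketch. The code below is our own hedged rendering for the K = 2 (matrix-normal) special case, not the paper's reference implementation: each step fixes one precision matrix, whitens the other mode with it to form a mode-wise sample covariance, and runs a sparse graphical-lasso update. Variable names (`Psi1`, `Psi2`) and the penalty level `alpha` are illustrative choices.

```python
import numpy as np
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(0)
m1, m2, n = 10, 8, 5                    # mode-1 dim, mode-2 dim, sample size
X = rng.standard_normal((n, m1, m2))    # n matrix-valued samples

Psi1 = np.eye(m1)                       # precision matrix for mode 1
Psi2 = np.eye(m2)                       # precision matrix for mode 2

for _ in range(5):                      # alternate: fix one factor, update the other
    # mode-1 sample covariance, whitening mode 2 with the current Psi2
    S1 = sum(x @ Psi2 @ x.T for x in X) / (n * m2)
    _, Psi1 = graphical_lasso(S1, alpha=0.1)   # sparse precision update
    # mode-2 sample covariance, whitening mode 1 with the current Psi1
    S2 = sum(x.T @ Psi1 @ x for x in X) / (n * m1)
    _, Psi2 = graphical_lasso(S2, alpha=0.1)
```

Note that each inner update is a convex graphical-lasso problem even though the joint objective over all modes is non-convex, which is what makes the alternating strategy attractive here.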

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1081-1089 |
| Number of pages | 9 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2015-January |
| State | Published - Jan 1 2015 |
| Externally published | Yes |
| Event | 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada (Dec 7 2015 → Dec 12 2015) |

### ASJC Scopus subject areas

- Computer Networks and Communications
- Information Systems
- Signal Processing


## Cite this

*Advances in Neural Information Processing Systems*, *2015-January*, 1081-1089.