
Convergence Properties of Kronecker Graphical Lasso Algorithms

Abstract

This report presents a thorough convergence analysis of Kronecker graphical lasso (KGlasso) algorithms for estimating the covariance of an i.i.d. Gaussian random sample under a sparse Kronecker-product covariance model. The KGlasso model, originally called the transposable regularized covariance model by Allen {\it et al} \cite{AllenTib10}, places an $\ell_1$ penalty on each Kronecker factor to enforce sparsity in the covariance estimator. The KGlasso algorithm generalizes Glasso, introduced by Yuan and Lin \cite{YL07} and Banerjee {\it et al} \cite{ModelSel}, to the estimation of covariances having Kronecker-product form. It also generalizes the unpenalized maximum-likelihood flip-flop (FF) algorithm of Werner {\it et al} \cite{EstCovMatKron} to the estimation of sparse Kronecker factors. We establish high-dimensional rates of convergence to the true covariance as both the number of samples and the number of variables go to infinity. Our results show that KGlasso has a significantly faster asymptotic convergence rate than Glasso and FF. Simulations are presented that validate the results of our analysis. For example, for a sparse $10{,}000 \times 10{,}000$ covariance matrix equal to the Kronecker product of two $100 \times 100$ matrices, the root mean squared error of the inverse covariance estimate using FF is 3.5 times larger than that obtainable using KGlasso.
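
As a rough illustration of the alternating scheme described above, the sketch below runs a flip-flop iteration in which the precision matrix of each Kronecker factor is re-estimated by a Glasso step while the other factor is held fixed. This is a minimal sketch, not the paper's reference implementation: the function name kglasso, the (n, p, q) data layout, the identity initialization, and the use of scikit-learn's graphical_lasso as the per-factor solver are all assumptions here, and the penalty scalings and stopping rule used in the paper may differ.

import numpy as np
from sklearn.covariance import graphical_lasso

def kglasso(X, lam_a, lam_b, n_iter=10):
    # Hypothetical sketch of a KGlasso-style flip-flop iteration.
    # X: (n, p, q) array; each sample is reshaped to p x q so that the
    # covariance factors as Sigma = A (kron) B, with A (p x p) and B (q x q).
    n, p, q = X.shape
    A_inv = np.eye(p)  # precision factor for A, initialized to identity
    B_inv = np.eye(q)  # precision factor for B, initialized to identity
    for _ in range(n_iter):
        # Conditional p x p sample covariance given the current B-factor,
        # followed by an l1-penalized Glasso step for the estimate of A^{-1}.
        S_a = sum(Xi @ B_inv @ Xi.T for Xi in X) / (n * q)
        _, A_inv = graphical_lasso(S_a, alpha=lam_a)
        # Conditional q x q sample covariance given the current A-factor,
        # followed by a Glasso step for the estimate of B^{-1}.
        S_b = sum(Xi.T @ A_inv @ Xi for Xi in X) / (n * p)
        _, B_inv = graphical_lasso(S_b, alpha=lam_b)
    # The estimated inverse covariance is the Kronecker product of the factors.
    return np.kron(A_inv, B_inv)

Under the same assumptions, the unpenalized FF algorithm of Werner {\it et al} roughly corresponds to replacing each graphical_lasso call with a plain inversion of the conditional sample covariance.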
