
Conditional Matrix Flows for Gaussian Graphical Models

Abstract

Studying conditional independence among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through $\ell_q$ regularization with $q \leq 1$. However, most GGMs rely on the $\ell_1$ norm because the objective is highly non-convex for sub-$\ell_1$ pseudo-norms. In the frequentist formulation, the $\ell_1$ norm relaxation provides the solution path as a function of the shrinkage parameter $\lambda$. In the Bayesian formulation, sparsity is instead encouraged through a Laplace prior, but posterior inference for different values of $\lambda$ requires repeated runs of expensive Gibbs samplers. Here we propose a general framework for variational inference with matrix-variate Normalizing Flow in GGMs, which unifies the benefits of the frequentist and Bayesian frameworks. As a key improvement on previous work, we train with one flow a continuum of sparse regression models jointly for all regularization parameters $\lambda$ and all $\ell_q$ norms, including the non-convex sub-$\ell_1$ pseudo-norms. Within one model we thus have access to (i) the evolution of the posterior for any $\lambda$ and any $\ell_q$ (pseudo-) norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths through simulated annealing in the MAP limit.
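To make the setup concrete, below is a minimal, self-contained PyTorch sketch of the idea the abstract describes: a single flow, conditioned on $(\lambda, q)$, is trained by stochastic variational inference against the unnormalized GGM posterior $\log p(X \mid \Theta) - \lambda \sum_{ij} |\Theta_{ij}|^q$. This is not the authors' implementation; a single conditional affine layer stands in for the paper's matrix-variate Normalizing Flow, and all names (`ConditionalAffineFlow`, `theta_from_y`, `log_target`) are illustrative assumptions.

```python
# Minimal sketch (assumptions flagged above), NOT the paper's code.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

d, n = 5, 40                        # number of variables, number of observations
X = torch.randn(n, d)               # toy data, assumed centered
S = X.T @ X                         # scatter matrix

D = d * (d + 1) // 2                # free parameters of a Cholesky factor
tril = torch.tril_indices(d, d)
diag_mask = tril[0] == tril[1]

class ConditionalAffineFlow(nn.Module):
    """One conditional affine layer y = z * exp(s(c)) + t(c), amortized over
    the condition c = (lambda, q). A tiny stand-in for the paper's flow."""
    def __init__(self, dim, cond_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(cond_dim, hidden), nn.Tanh(), nn.Linear(hidden, 2 * dim))

    def forward(self, z, c):
        s, t = self.net(c).chunk(2, dim=-1)
        y = z * torch.exp(s) + t
        # log q(y) = log N(z; 0, I) - sum(s)   (change of variables)
        log_q = (-0.5 * z.pow(2) - 0.5 * math.log(2 * math.pi)).sum(-1) - s.sum(-1)
        return y, log_q

def theta_from_y(y):
    """Map free parameters y to a precision matrix Theta = L L^T (positive
    definite by construction); also return the log-Jacobian of y -> Theta."""
    vals = y.clone()
    vals[:, diag_mask] = torch.exp(y[:, diag_mask])     # positive diagonal
    L = torch.zeros(y.shape[0], d, d)
    L[:, tril[0], tril[1]] = vals
    log_jac = y[:, diag_mask].sum(-1)                   # exp on the diagonal
    powers = torch.arange(d, 0, -1, dtype=y.dtype)      # d+1-i for i = 1..d
    log_jac = log_jac + d * math.log(2.0) + (           # Cholesky map L -> Theta
        powers * torch.log(torch.diagonal(L, dim1=-2, dim2=-1))).sum(-1)
    return L @ L.transpose(-1, -2), log_jac

def log_target(theta, lam, q):
    """Unnormalized log-posterior: Gaussian log-likelihood (constants dropped)
    plus an l_q shrinkage prior with rate lambda on the precision entries."""
    loglik = 0.5 * n * torch.logdet(theta) - 0.5 * torch.einsum('bij,ji->b', theta, S)
    log_prior = -lam * theta.abs().pow(q.view(-1, 1, 1)).sum((-2, -1))
    return loglik + log_prior

flow = ConditionalAffineFlow(D)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)

for step in range(2000):
    lam = torch.rand(128, 1) * 10.0          # sample the shrinkage parameter
    q = 0.5 + 0.5 * torch.rand(128, 1)       # sample sub-l1 exponents q <= 1
    z = torch.randn(128, D)
    y, log_q = flow(z, torch.cat([lam, q], dim=-1))
    theta, log_jac = theta_from_y(y)
    # negative ELBO, averaged over the sampled (lambda, q) conditions
    loss = (log_q - log_jac - log_target(theta, lam.squeeze(-1), q.squeeze(-1))).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, sweeping $\lambda$ at a fixed $q$ would trace out an approximate solution path from the single trained flow, and annealing the target toward the MAP limit would recover frequentist point estimates, mirroring points (i) and (iii) of the abstract; the full paper's matrix-variate flow and annealing schedule are substantially more elaborate than this one-layer sketch.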
