Learning Directed Graphical Models from Gaussian Data

Abstract

In this paper, we introduce a new directed graphical model for Gaussian data: the Gaussian graphical interaction model (GGIM). The model arises from considering stationary Gaussian processes on graphs and leveraging the equations relating the resulting steady-state covariance matrix to the Laplacian matrix representing the interaction graph. Through conceptually straightforward theory, we develop the new model and interpret the edges in the graphical model in terms of statistical measures. We show that, when restricted to undirected graphs, the Laplacian matrix representing a GGIM is equivalent to the standard inverse covariance matrix that encodes conditional dependence relationships. Furthermore, our approach leads to a natural definition of directed conditional independence between two elements of a stationary Gaussian process. We demonstrate that the problem of learning sparse GGIMs for a given observation set can be framed as a LASSO problem. By comparison with the problem of inverse covariance estimation, we prove a bound on the difference between the covariance matrix corresponding to a sparse GGIM and the covariance matrix corresponding to the ℓ1-norm penalized maximum log-likelihood estimate. Finally, we consider the problem of learning GGIMs associated with sparse directed conditional dependence relationships. In all, the new model offers a novel perspective on directed relationships between variables and significantly expands on the state of the art in Gaussian graphical modeling.
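As a concrete point of reference, the sketch below illustrates the closely related baseline that the abstract compares against: ℓ1-norm penalized maximum log-likelihood (graphical lasso) estimation of a sparse inverse covariance matrix for an undirected Gaussian graphical model. It is not the paper's GGIM formulation; the synthetic data and the penalty weight `alpha` are illustrative assumptions.

```python
# Minimal sketch: l1-penalized maximum log-likelihood (graphical lasso)
# estimation of a sparse inverse covariance matrix -- the undirected
# baseline the abstract compares GGIM learning against. The synthetic
# data and penalty weight are illustrative assumptions, not the paper's
# GGIM-specific LASSO formulation.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Synthetic zero-mean Gaussian observations (n samples, p variables).
n, p = 500, 5
X = rng.standard_normal((n, p))

# Fit the l1-penalized estimator; larger alpha yields a sparser precision matrix.
model = GraphicalLasso(alpha=0.1).fit(X)

# Nonzero off-diagonal entries of the precision (inverse covariance) matrix
# encode conditional dependence edges in the undirected graphical model.
precision = model.precision_
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
print("Estimated conditional dependence edges:", edges.tolist())
```

In the undirected case the abstract notes that the GGIM Laplacian coincides with this inverse covariance matrix; the paper's contribution is the directed generalization, which this sketch does not capture.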
