Optimal estimation of Gaussian DAG models

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
Main: 10 pages · Appendix: 8 pages · Bibliography: 3 pages · 2 figures · 2 tables
Abstract

We study the optimal sample complexity of learning a Gaussian directed acyclic graph (DAG) from observational data. Our main result establishes the minimax optimal sample complexity for learning the structure of a linear Gaussian DAG model with equal variances to be $n \asymp q\log(d/q)$, where $q$ is the maximum number of parents and $d$ is the number of nodes. We further make comparisons with the classical problem of learning (undirected) Gaussian graphical models, showing that under the equal variance assumption, these two problems share the same optimal sample complexity. In other words, at least for Gaussian models with equal error variances, learning a directed graphical model is not more difficult than learning an undirected graphical model. Our results also extend to more general identification assumptions as well as subgaussian errors.
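For context, a minimal sketch of the equal-variance linear Gaussian DAG model the abstract refers to, together with the stated minimax rate. The notation ($\beta_{jk}$, $\mathrm{pa}(j)$, $\sigma^2$) is assumed for illustration and may differ from the paper's:

```latex
% Linear Gaussian structural equation model over a DAG G on d nodes
% (illustrative notation, not taken verbatim from the paper):
% each node is a linear function of its parents plus Gaussian noise
% with a common variance sigma^2 (the "equal variances" assumption).
\[
  X_j \;=\; \sum_{k \in \mathrm{pa}(j)} \beta_{jk} X_k \;+\; \varepsilon_j,
  \qquad \varepsilon_j \sim \mathcal{N}(0, \sigma^2),
\]
% Minimax optimal sample complexity for recovering the structure of G,
% where q = max_j |pa(j)| is the maximum in-degree and d the number of nodes:
\[
  n \;\asymp\; q \log(d/q).
\]
```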
