
Approximate Probabilistic Inference with Composed Flows

International Conference on Machine Learning (ICML), 2020
Abstract

We study the problem of probabilistic inference on the joint distribution defined by a normalizing flow model. Given a pre-trained flow model $p(\boldsymbol{x})$, we wish to estimate $p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)$ for some arbitrary partitioning of the variables $\boldsymbol{x} = (\boldsymbol{x}_1, \boldsymbol{x}_2)$. We first show that this task is computationally hard for a large class of flow models. Motivated by this hardness result, we propose a framework for *approximate* probabilistic inference. Specifically, our method trains a new generative model with the property that its composition with the given model approximates the target conditional distribution. By parametrizing this new distribution as another flow model, we can efficiently train it using variational inference and also handle conditioning under arbitrary differentiable transformations. We experimentally demonstrate that our approach outperforms Langevin Dynamics in terms of sample quality, while requiring far fewer parameters and much less training time than regular variational inference. We further validate the flexibility of our method on a variety of inference tasks with applications to inverse problems.
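The core idea, composing a small approximate-posterior flow with a fixed pre-trained flow and training it by maximizing a variational lower bound, can be illustrated on a toy Gaussian model where everything is tractable. The sketch below is hypothetical and not from the paper: the "pre-trained flow" is an affine map $\boldsymbol{x} = A\boldsymbol{z} + \boldsymbol{b}$ of standard Gaussian noise, the variational family over $\boldsymbol{x}_2$ is a one-dimensional affine flow $x_2 = m + s\,\epsilon$, and the names (`A`, `b`, `elbo`, `m`, `log_s`) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's implementation): estimate the
# variational objective E_q[log p(x1, x2)] + H(q) for a candidate q(x2 | x1)
# composed with a fixed, exactly-invertible "pre-trained" affine flow.
import numpy as np

rng = np.random.default_rng(0)

# Pre-trained flow over x = (x1, x2): x = A z + b with z ~ N(0, I).
A = np.array([[1.0, 0.0], [0.8, 0.6]])
b = np.array([0.5, -0.2])
A_inv = np.linalg.inv(A)
log_det_A = np.log(abs(np.linalg.det(A)))

def log_p(x):
    """Exact log-density of the pre-trained flow via change of variables."""
    z = (x - b) @ A_inv.T
    return -0.5 * np.sum(z**2, axis=-1) - np.log(2 * np.pi) - log_det_A

def elbo(m, log_s, x1, n=200_000):
    """Monte Carlo ELBO for q(x2 | x1) = N(m, s^2): E_q[log p] + H(q)."""
    s = np.exp(log_s)
    x2 = m + s * rng.standard_normal(n)           # sample from the new flow
    x = np.stack([np.full(n, x1), x2], axis=-1)   # compose with conditioning
    entropy = 0.5 * np.log(2 * np.pi * np.e) + log_s
    return log_p(x).mean() + entropy

# In this Gaussian toy model the true conditional p(x2 | x1) is closed-form,
# so we can check that the ELBO prefers it over a shifted candidate.
cov = A @ A.T
mu2 = b[1] + cov[1, 0] / cov[0, 0] * (1.0 - b[0])   # condition on x1 = 1.0
var2 = cov[1, 1] - cov[1, 0] ** 2 / cov[0, 0]
good = elbo(mu2, 0.5 * np.log(var2), x1=1.0)
bad = elbo(mu2 + 1.0, 0.5 * np.log(var2), x1=1.0)
print(good > bad)
```

In the real method the affine maps are replaced by expressive flows and the objective is maximized by gradient ascent on the new flow's parameters, but the structure is the same: samples from the trainable flow are pushed through the frozen model, whose exact log-density makes the ELBO estimable.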
