ResearchTrend.AI

arXiv:1711.00695
A Universal Marginalizer for Amortized Inference in Generative Models

2 November 2017
Laura Douglas
Iliyan Zarov
Kostis Gourgoulias
Chris Lucas
Chris Hart
Adam Baker
M. Sahani
Yura N. Perov
Saurabh Johri
Communities: UQCV, CML
Abstract

We consider the problem of inference in a causal generative model where the set of available observations differs between data instances. We show how combining samples drawn from the graphical model with an appropriate masking function makes it possible to train a single neural network to approximate all the corresponding conditional marginal distributions and thus amortize the cost of inference. We further demonstrate that the efficiency of importance sampling may be improved by basing proposals on the output of the neural network. We also outline how the same network can be used to generate samples from an approximate joint posterior via a chain decomposition of the graph.
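The core training trick described above — draw complete samples from the generative model, then hide a random subset of nodes so a single network learns every conditional marginal — can be sketched as follows. The three-node binary chain model, the observation probability, and the 0.5 "unknown" encoding are illustrative assumptions, not the paper's actual graphical model or masking scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy causal model: binary chain X1 -> X2 -> X3.
def sample_joint(n):
    """Draw n complete samples from the generative model."""
    x1 = rng.random(n) < 0.3
    x2 = rng.random(n) < np.where(x1, 0.8, 0.1)
    x3 = rng.random(n) < np.where(x2, 0.9, 0.2)
    return np.stack([x1, x2, x3], axis=1).astype(float)

def mask_samples(x, p_observe=0.5):
    """Hide each node independently; hidden entries are encoded as 0.5.

    Returns (masked inputs, full samples): the network is trained to
    predict every node's marginal from the masked input, so one model
    amortizes inference over all observation patterns.
    """
    observed = rng.random(x.shape) < p_observe
    inputs = np.where(observed, x, 0.5)
    return inputs, x

x = sample_joint(1000)
inp, tgt = mask_samples(x)
```

Feeding `(inp, tgt)` pairs to any standard classifier with a per-node cross-entropy loss would train such a marginalizer; the same network's outputs could then serve as importance-sampling proposals, as the abstract notes.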
