Posterior Matching for Arbitrary Conditioning

28 January 2022
R. Strauss
Junier B. Oliva
    CML
    BDL
arXiv: 2201.12414
Abstract

Arbitrary conditioning is an important problem in unsupervised learning, where we seek to model the conditional densities $p(\mathbf{x}_u \mid \mathbf{x}_o)$ that underlie some data, for all possible non-intersecting subsets $o, u \subset \{1, \dots, d\}$. However, the vast majority of density estimation focuses only on modeling the joint distribution $p(\mathbf{x})$, in which important conditional dependencies between features are opaque. We propose a simple and general framework, coined Posterior Matching, that enables Variational Autoencoders (VAEs) to perform arbitrary conditioning, without modification to the VAE itself. Posterior Matching applies to the numerous existing VAE-based approaches to joint density estimation, thereby circumventing the specialized models required by previous approaches to arbitrary conditioning. We find that Posterior Matching is comparable or superior to current state-of-the-art methods for a variety of tasks with an assortment of VAEs (e.g., discrete, hierarchical, VaDE).
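The abstract's core idea, a second encoder for partially observed inputs trained to match the pretrained VAE's posterior, can be sketched as below. This is a minimal illustration under assumed simplifications (diagonal-Gaussian posteriors, a mask-concatenation encoder, and hypothetical names such as PartiallyObservedEncoder and posterior_matching_loss); it is not the authors' implementation.

```python
import torch
import torch.nn as nn

class PartiallyObservedEncoder(nn.Module):
    """Maps (x * mask, mask) to a diagonal Gaussian over the VAE latent z.
    Hypothetical architecture; any conditioner over (x_o, mask) could be used."""
    def __init__(self, x_dim, z_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * z_dim),
        )

    def forward(self, x, mask):
        h = self.net(torch.cat([x * mask, mask], dim=-1))
        mu, log_sigma = h.chunk(2, dim=-1)
        return torch.distributions.Normal(mu, log_sigma.exp())

def posterior_matching_loss(vae_encoder, po_encoder, x, mask):
    # vae_encoder is the frozen, pretrained VAE encoder; it is assumed to
    # return a torch.distributions object representing q(z | x).
    with torch.no_grad():                      # the base VAE is left unchanged
        z = vae_encoder(x).sample()            # z ~ q(z | x), full observation
    q_z_given_xo = po_encoder(x, mask)         # q_psi(z | x_o), partial observation
    # Maximize the likelihood of full-posterior samples under q_psi(z | x_o).
    return -q_z_given_xo.log_prob(z).sum(dim=-1).mean()
```

At inference time, a conditional sample of the unobserved features x_u given x_o would be obtained by drawing z from the partially observed encoder and decoding it with the VAE's original decoder.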
