
Stochastic Gradient VB and the Variational Auto-Encoder

Diederik P. Kingma, Max Welling
20 December 2013
Community: BDL (Bayesian Deep Learning)
arXiv: https://arxiv.org/abs/1312.6114 (PDF and HTML available)
Abstract

Can we efficiently learn the parameters of directed probabilistic models when the latent variables are continuous, the posterior distributions are intractable, and the datasets are large? We introduce an unsupervised online learning algorithm that efficiently optimizes the variational lower bound on the marginal likelihood and that, under some mild conditions, works even in the intractable case. The algorithm, Stochastic Gradient Variational Bayes (SGVB), optimizes a probabilistic encoder (also called a recognition model) to approximate the intractable posterior distribution of the latent variables. The crucial step is a reparameterization of the variational bound with an independent noise variable, which yields a stochastic objective function that can be optimized jointly w.r.t. the variational and generative parameters using standard gradient-based stochastic optimization methods. The theoretical advantages are reflected in experimental results.
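
To make the reparameterization concrete: writing the latent variable as z = mu + sigma * eps, with independent noise eps ~ N(0, I), moves the randomness outside the parameters, so a Monte Carlo estimate of the variational bound is differentiable w.r.t. both the variational (encoder) and generative (decoder) parameters. The following is a minimal PyTorch sketch of such an estimator for a Gaussian encoder with a Bernoulli decoder; the layer sizes, the Adam optimizer, and the binary cross-entropy likelihood are illustrative assumptions, not details fixed by the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    # Probabilistic encoder (recognition model) q(z|x) plus decoder p(x|z).
    # All sizes are illustrative assumptions, not taken from the paper.
    def __init__(self, x_dim=784, h_dim=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)  # log-variance of q(z|x)
        self.dec = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.Tanh(),
            nn.Linear(h_dim, x_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        h = torch.tanh(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization: z = mu + sigma * eps with eps ~ N(0, I).
        # The noise is independent of the parameters, so gradients flow
        # through mu and logvar, turning the bound into a stochastic
        # objective for standard gradient-based optimizers.
        eps = torch.randn_like(mu)
        z = mu + torch.exp(0.5 * logvar) * eps
        return self.dec(z), mu, logvar

def neg_elbo(x, x_rec, mu, logvar):
    # Negative variational lower bound (to be minimized):
    # reconstruction term plus KL(q(z|x) || p(z)) with prior p(z) = N(0, I).
    rec = F.binary_cross_entropy(x_rec, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # any SGD variant works
x = torch.rand(128, 784)  # stand-in minibatch with values in [0, 1]
x_rec, mu, logvar = model(x)
loss = neg_elbo(x, x_rec, mu, logvar)
opt.zero_grad()
loss.backward()
opt.step()

A single optimizer over model.parameters() updates the encoder and decoder together, which is the joint optimization of variational and generative parameters the abstract describes; without the reparameterization, the sampling step would block gradients to mu and logvar.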
