f-Divergence Variational Inference

28 September 2020
Neng Wan
Dapeng Li
N. Hovakimyan
arXiv:2009.13093
Abstract

This paper introduces f-divergence variational inference (f-VI), which generalizes variational inference to the whole family of f-divergences. Starting from the minimization of a crafty surrogate f-divergence that shares statistical consistency with the f-divergence, the f-VI framework not only unifies a number of existing VI methods, e.g., Kullback-Leibler VI, Rényi's α-VI, and χ-VI, but also offers a standardized toolkit for VI subject to arbitrary divergences from the f-divergence family. A general f-variational bound is derived, providing a sandwich estimate of the marginal likelihood (or evidence). The development of f-VI unfolds with a stochastic optimization scheme that utilizes the reparameterization trick, importance weighting, and Monte Carlo approximation; a mean-field approximation scheme that generalizes the well-known coordinate ascent variational inference (CAVI) is also proposed for f-VI. Empirical examples, including variational autoencoders and Bayesian neural networks, are provided to demonstrate the effectiveness and wide applicability of f-VI.
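As a concrete illustration of the stochastic optimization ingredients mentioned in the abstract, the sketch below estimates the standard evidence lower bound (the Kullback-Leibler special case that f-VI generalizes) using the reparameterization trick and Monte Carlo averaging on a toy Gaussian model. It is a minimal, self-contained example, not the paper's implementation; the toy model, the Gaussian variational family, and the grid search are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): reparameterized Monte Carlo estimate of
# the evidence lower bound (ELBO), i.e. the Kullback-Leibler special case of f-VI.
# The toy model, Gaussian variational family, and grid search are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy model: observations x_i ~ N(theta, 1) with prior theta ~ N(0, 1).
x = rng.normal(loc=2.0, scale=1.0, size=50)

def log_joint(theta):
    """log p(x, theta) for the toy Gaussian model."""
    log_prior = -0.5 * theta ** 2 - 0.5 * np.log(2.0 * np.pi)
    log_lik = np.sum(-0.5 * (x - theta) ** 2 - 0.5 * np.log(2.0 * np.pi))
    return log_prior + log_lik

def elbo_estimate(mu, log_sigma, num_samples=64):
    """Monte Carlo ELBO with the reparameterization theta = mu + sigma * eps."""
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=num_samples)
    thetas = mu + sigma * eps  # reparameterized samples from q(theta) = N(mu, sigma^2)
    log_q = (-0.5 * ((thetas - mu) / sigma) ** 2
             - np.log(sigma) - 0.5 * np.log(2.0 * np.pi))
    log_p = np.array([log_joint(t) for t in thetas])
    return np.mean(log_p - log_q)  # E_q[log p(x, theta) - log q(theta)]

# Crude grid search over the variational parameters; a gradient-based optimizer
# would be used in practice.
mu_best, ls_best, bound = max(
    ((mu, ls, elbo_estimate(mu, ls))
     for mu in np.linspace(0.0, 3.0, 31)
     for ls in np.linspace(-3.0, 0.0, 16)),
    key=lambda triple: triple[2],
)
print(f"q(theta) = N({mu_best:.2f}, {np.exp(ls_best):.2f}^2), ELBO ~ {bound:.2f}")
```

This objective lower-bounds the log evidence; per the abstract, the paper's f-variational bound replaces this KL-specific objective with one indexed by an arbitrary f-divergence and, together with importance weighting, yields a sandwich estimate of the evidence rather than only a lower bound.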
