Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows
17 July 2022 · arXiv:2207.08219
Lorenz Vaitl, K. Nicoli, Shinichi Nakajima, Pan Kessel
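The idea named in the title is the path gradient: when estimating the reverse KL divergence E_{x~q_θ}[log q_θ(x) − log p(x)] with reparameterized flow samples, the score term has zero expectation and only adds variance, so the gradient should flow through the sample path alone. Below is a minimal sketch of that idea in the style of the earlier "sticking the landing" estimator which this paper refines; the toy affine flow, the Gaussian target, and all function names are illustrative assumptions, not the authors' code.

```python
import jax
import jax.numpy as jnp

# Toy affine flow x = mu + exp(log_sigma) * z with a standard-normal base
# (an illustrative assumption, not the paper's architecture).
def flow_forward(params, z):
    mu, log_sigma = params
    return mu + jnp.exp(log_sigma) * z

# Exact flow density: pull x back through the inverse, add the log-det term.
def flow_log_q(params, x):
    mu, log_sigma = params
    z = (x - mu) * jnp.exp(-log_sigma)
    log_base = -0.5 * jnp.sum(z**2 + jnp.log(2.0 * jnp.pi), axis=-1)
    return log_base - jnp.sum(log_sigma)

# Illustrative unnormalized target: a Gaussian at mean 2 with std 0.5.
def log_p(x):
    return -0.5 * jnp.sum(((x - 2.0) / 0.5) ** 2, axis=-1)

def reverse_kl_path(params, key, n=1024):
    z = jax.random.normal(key, (n, 2))
    x = flow_forward(params, z)  # gradients flow through the sample path
    # stop_gradient on the density's own parameters drops the zero-mean
    # score term, leaving only the lower-variance path gradient.
    log_q = flow_log_q(jax.lax.stop_gradient(params), x)
    return jnp.mean(log_q - log_p(x))

params = (jnp.zeros(2), jnp.zeros(2))
path_grad = jax.grad(reverse_kl_path)(params, jax.random.PRNGKey(0))
print(path_grad)
```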

Papers citing "Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows"

4 of 4 citing papers shown:

1. Stable Training of Normalizing Flows for High-dimensional Variational Inference
   Daniel Andrade · BDL, TPM · 26 Feb 2024

2. Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories
   K. Nicoli, Christopher J. Anders, T. Hartung, K. Jansen, Pan Kessel, Shinichi Nakajima · 27 Feb 2023

3. VarGrad: A Low-Variance Gradient Estimator for Variational Inference (see the sketch after this list)
   Lorenz Richter, Ayman Boustati, Nikolas Nusken, Francisco J. R. Ruiz, Ömer Deniz Akyildiz · DRL · 20 Oct 2020

4. On the Difficulty of Unbiased Alpha Divergence Minimization
   Tomas Geffner, Justin Domke · 19 Oct 2020
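For contrast with the pathwise approach above, VarGrad (entry 3) takes a score-function route: as I understand the paper's main identity, differentiating the empirical variance of the log-weights log q_θ(x) − log p(x), with the samples held out of the differentiation path, recovers a REINFORCE-style estimator with a built-in leave-one-out control variate. A minimal sketch under the same assumptions, reusing flow_forward, flow_log_q, log_p, and params from the code above:

```python
def vargrad_loss(params, key, n=64):
    z = jax.random.normal(key, (n, 2))
    # Hold the samples fixed: VarGrad is a score-function-style estimator,
    # so no gradient flows through the sampling path here.
    x = jax.lax.stop_gradient(flow_forward(params, z))
    log_w = flow_log_q(params, x) - log_p(x)
    # The gradient of this empirical variance is the VarGrad estimator.
    return jnp.var(log_w)

vargrad = jax.grad(vargrad_loss)(params, jax.random.PRNGKey(1))
print(vargrad)
```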