arXiv: 2402.17886 (v3, latest)
Zeroth-Order Sampling Methods for Non-Log-Concave Distributions: Alleviating Metastability by Denoising Diffusion
27 February 2024
Ye He, Kevin Rojas, Molei Tao
DiffM
Links: ArXiv (abs) · PDF · HTML
Papers citing "Zeroth-Order Sampling Methods for Non-Log-Concave Distributions: Alleviating Metastability by Denoising Diffusion" (41 papers shown)

- Improving the evaluation of samplers on multi-modal targets — Louis Grenioux, Maxence Noble, Marylou Gabrié (11 Apr 2025)
- Provable Convergence and Limitations of Geometric Tempering for Langevin Dynamics — Omar Chehab, Anna Korba, Austin Stromme, Adrien Vacher (13 Oct 2024)
- Stochastic Localization via Iterative Posterior Sampling — Louis Grenioux, Maxence Noble, Marylou Gabrié, Alain Durmus [DiffM] (16 Feb 2024)
- Faster Sampling without Isoperimetry via Diffusion-based Monte Carlo — Xunpeng Huang, Difan Zou, Hanze Dong, Yian Ma, Tong Zhang [DiffM] (12 Jan 2024)
- Nearly d-Linear Convergence Bounds for Diffusion Models via Stochastic Localization — Joe Benton, Valentin De Bortoli, Arnaud Doucet, George Deligiannidis [DiffM] (07 Aug 2023)
- Reverse Diffusion Monte Carlo — Xunpeng Huang, Hanze Dong, Yi Hao, Yi-An Ma, Tong Zhang [DiffM] (05 Jul 2023)
- Improved sampling via learned diffusions — Lorenz Richter, Julius Berner [DiffM] (03 Jul 2023)
- Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models — Gen Li, Yuting Wei, Yuxin Chen, Yuejie Chi [DiffM] (15 Jun 2023)
- A Simple Proof of the Mixing of Metropolis-Adjusted Langevin Algorithm under Smoothness and Isoperimetry — Yuansi Chen, Khashayar Gatmiry (08 Apr 2023)
- Improved Bound for Mixing Time of Parallel Tempering — Holden Lee, Zeyu Shen (03 Apr 2023)
- Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation — David Holzmüller, Francis R. Bach (06 Mar 2023)
- Mean-Square Analysis of Discretized Itô Diffusions for Heavy-tailed Sampling — Ye He, Tyler Farghly, Krishnakumar Balasubramanian, Murat A. Erdogdu (01 Mar 2023)
- Denoising Diffusion Samplers — Francisco Vargas, Will Grathwohl, Arnaud Doucet [DiffM] (27 Feb 2023)
- Improved dimension dependence of a proximal algorithm for sampling — JiaoJiao Fan, Bo Yuan, Yongxin Chen (20 Feb 2023)
- Regularized Stein Variational Gradient Flow — Ye He, Krishnakumar Balasubramanian, Bharath K. Sriperumbudur, Jianfeng Lu [OT] (15 Nov 2022)
- Improved Analysis of Score-based Generative Modeling: User-Friendly Bounds under Minimal Smoothness Assumptions — Hongrui Chen, Holden Lee, Jianfeng Lu [DiffM] (03 Nov 2022)
- Convergence of the Inexact Langevin Algorithm and Score-based Generative Models in KL Divergence — Kaylee Yingxi Yang, Andre Wibisono (02 Nov 2022)
- Convergence of score-based generative modeling for general data distributions — Holden Lee, Jianfeng Lu, Yixin Tan [DiffM] (26 Sep 2022)
- Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions — Sitan Chen, Sinho Chewi, Jungshian Li, Yuanzhi Li, Adil Salim, Anru R. Zhang [DiffM] (22 Sep 2022)
- Convergence of denoising diffusion models under the manifold hypothesis — Valentin De Bortoli [DiffM] (10 Aug 2022)
- Localization Schemes: A Framework for Proving Mixing Bounds for Markov Chains — Yuansi Chen, Ronen Eldan (08 Mar 2022)
- A Proximal Algorithm for Sampling — Jiaming Liang, Yongxin Chen (28 Feb 2022)
- Improved analysis for a proximal algorithm for sampling — Yongxin Chen, Sinho Chewi, Adil Salim, Andre Wibisono (13 Feb 2022)
- Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo — Krishnakumar Balasubramanian, Sinho Chewi, Murat A. Erdogdu, Adil Salim, Matthew Shunshi Zhang (10 Feb 2022)
- Path Integral Sampler: a stochastic control approach for sampling — Qinsheng Zhang, Yongxin Chen [DiffM] (30 Nov 2021)
- Bayesian Learning via Neural Schrödinger-Föllmer Flows — Francisco Vargas, Andrius Ovsianas, David Fernandes, Mark Girolami, Neil D. Lawrence, Nikolas Nusken [BDL] (20 Nov 2021)
- The Mirror Langevin Algorithm Converges with Vanishing Bias — Ruilin Li, Molei Tao, Santosh Vempala, Andre Wibisono (24 Sep 2021)
- Sqrt(d) Dimension Dependence of Langevin Monte Carlo — Ruilin Li, H. Zha, Molei Tao (08 Sep 2021)
- A Convergence Theory for SVGD in the Population Limit under Talagrand's Inequality T1 — Adil Salim, Lukang Sun, Peter Richtárik (06 Jun 2021)
- Score-Based Generative Modeling through Stochastic Differential Equations — Yang Song, Jascha Narain Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole [DiffM, SyDa] (26 Nov 2020)
- On the Ergodicity, Bias and Asymptotic Normality of Randomized Midpoint Sampling Method — Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu (06 Nov 2020)
- Structured Logconcave Sampling with a Restricted Gaussian Oracle — Y. Lee, Ruoqi Shen, Kevin Tian (07 Oct 2020)
- Denoising Diffusion Probabilistic Models — Jonathan Ho, Ajay Jain, Pieter Abbeel [DiffM] (19 Jun 2020)
- SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence — Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet (03 Jun 2020)
- The Randomized Midpoint Method for Log-Concave Sampling — Ruoqi Shen, Y. Lee (12 Sep 2019)
- Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices — Santosh Vempala, Andre Wibisono (20 Mar 2019)
- On sampling from a log-concave density using kinetic Langevin diffusions — A. Dalalyan, L. Riou-Durand (24 Jul 2018)
- Log-concave sampling: Metropolis-Hastings algorithms are fast — Raaz Dwivedi, Yuansi Chen, Martin J. Wainwright, Bin Yu (08 Jan 2018)
- User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient — A. Dalalyan, Avetik G. Karagulyan (29 Sep 2017)
- Stein Variational Gradient Descent as Gradient Flow — Qiang Liu [OT] (25 Apr 2017)
- Deep Unsupervised Learning using Nonequilibrium Thermodynamics — Jascha Narain Sohl-Dickstein, Eric A. Weiss, Niru Maheswaranathan, Surya Ganguli [SyDa, DiffM] (12 Mar 2015)