Convergence of Langevin Monte Carlo in Chi-Squared and Rényi Divergence

Abstract

We study sampling from a target distribution $\nu_* = e^{-f}$ using the unadjusted Langevin Monte Carlo (LMC) algorithm when the potential $f$ satisfies a strong dissipativity condition and is first-order smooth with a Lipschitz gradient. We prove that, initialized with a Gaussian random vector that has sufficiently small variance, iterating the LMC algorithm for $\widetilde{\mathcal{O}}(\lambda^2 d \epsilon^{-1})$ steps is sufficient to reach an $\epsilon$-neighborhood of the target in both chi-squared and Rényi divergence, where $\lambda$ is the logarithmic Sobolev constant of $\nu_*$. Our results do not require a warm start to deal with the exponential dimension dependence of the chi-squared divergence at initialization. In particular, for strongly convex and first-order smooth potentials, we show that the LMC algorithm achieves the rate estimate $\widetilde{\mathcal{O}}(d \epsilon^{-1})$, which improves the previously known rates in both of these metrics under the same assumptions. Translating this rate to other metrics, our results also recover the state-of-the-art rate estimates in KL divergence, total variation, and 2-Wasserstein distance in the same setup. Finally, as we rely on the logarithmic Sobolev inequality, our framework covers a range of non-convex potentials that are first-order smooth and exhibit strong convexity outside of a compact region.
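For concreteness, below is a minimal sketch of the unadjusted LMC iteration the abstract refers to, $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I_d)$, initialized from a small-variance Gaussian as in the paper's warm-start-free analysis. The step size `eta`, the initialization scale `sigma0`, and the quadratic test potential are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

def lmc_sample(grad_f, d, n_steps, eta, sigma0, rng=None):
    """Unadjusted Langevin Monte Carlo targeting nu_* = exp(-f).

    Update: x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta) * xi_k,
    with xi_k ~ N(0, I_d). Initialization is N(0, sigma0^2 I_d); eta and
    sigma0 are illustrative choices, not the paper's theoretical values.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = sigma0 * rng.standard_normal(d)  # small-variance Gaussian start
    for _ in range(n_steps):
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * rng.standard_normal(d)
    return x

# Example: strongly convex quadratic potential f(x) = ||x||^2 / 2, so the
# target nu_* is the standard Gaussian and grad_f(x) = x.
if __name__ == "__main__":
    d = 10
    samples = np.array([
        lmc_sample(grad_f=lambda x: x, d=d, n_steps=2000, eta=1e-2, sigma0=0.1)
        for _ in range(500)
    ])
    print("sample mean norm (should be near 0):", np.linalg.norm(samples.mean(axis=0)))
    print("average marginal variance (should be near 1):", samples.var(axis=0).mean())
```

Note that for a fixed step size the chain's stationary law is only an $\mathcal{O}(\eta)$-biased approximation of $\nu_*$; the paper's rate statements quantify how small $\eta$ and how many steps suffice for an $\epsilon$-accurate sample in the stated divergences.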
