arXiv:1808.00728
Higher Order Langevin Monte Carlo Algorithm

2 August 2018
Sotirios Sabanis
Ying Zhang
Abstract

A new (unadjusted) Langevin Monte Carlo (LMC) algorithm with improved rates in total variation and in Wasserstein distance is presented. All these are obtained in the context of sampling from a target distribution $\pi$ that has a density $\hat{\pi}$ on $\mathbb{R}^d$ known up to a normalizing constant. Moreover, $-\log \hat{\pi}$ is assumed to have a locally Lipschitz gradient and its third derivative is locally Hölder continuous with exponent $\beta \in (0,1]$. Non-asymptotic bounds are obtained for the convergence to stationarity of the new sampling method with convergence rate $1 + \beta/2$ in Wasserstein distance, while it is shown that the rate is $1$ in total variation even in the absence of convexity. Finally, in the case where $-\log \hat{\pi}$ is strongly convex and its gradient is Lipschitz continuous, explicit constants are provided.
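For context, a minimal sketch of the baseline this work improves upon: the standard (first-order, unadjusted) Langevin Monte Carlo iteration $x_{k+1} = x_k + h \nabla \log \hat{\pi}(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I_d)$. This is *not* the paper's higher-order scheme, whose update rule is not given in this abstract; the function name `ula_sample` and the Gaussian example target are illustrative assumptions.

```python
import numpy as np

def ula_sample(grad_log_pi, x0, step, n_steps, rng=None):
    """Plain unadjusted Langevin algorithm (ULA), sketched for illustration.

    Iterates x_{k+1} = x_k + step * grad_log_pi(x_k) + sqrt(2*step) * N(0, I).
    Note: grad_log_pi is the gradient of log pi, i.e. -grad of the potential
    -log pi; only the unnormalized density is needed, since the normalizing
    constant drops out of the gradient.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Illustrative target (assumption): standard Gaussian on R^2,
# where -log pi(x) = ||x||^2 / 2, so grad log pi(x) = -x.
samples = ula_sample(lambda x: -x, x0=np.zeros(2), step=0.05,
                     n_steps=20000, rng=0)
```

Because the chain is unadjusted (no Metropolis correction), its stationary law carries a step-size-dependent bias; the paper's contribution is a higher-order scheme whose Wasserstein error decays at rate $1 + \beta/2$ in the step size rather than the first-order rate.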
