Subgradient Langevin Methods for Sampling from Non-smooth Potentials

2 August 2023
Andreas Habring
M. Holler
T. Pock
arXiv:2308.01417
Abstract

This paper is concerned with sampling from probability distributions $\pi$ on $\mathbb{R}^d$ admitting a density of the form $\pi(x) \propto e^{-U(x)}$, where $U(x) = F(x) + G(Kx)$ with $K$ a linear operator and $G$ non-differentiable. Two different methods are proposed, both employing a subgradient step with respect to $G \circ K$; depending on the regularity of $F$, either an explicit or an implicit gradient step with respect to $F$ can be implemented. For both methods, non-asymptotic convergence proofs are provided, with improved convergence results for more regular $F$. Further, numerical experiments are conducted for simple 2D examples, illustrating the convergence rates, and for examples from Bayesian imaging, showing the practical feasibility of the proposed methods for high-dimensional data.
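For intuition, the explicit variant described above admits a compact implementation: each iteration combines a gradient step in $F$, a subgradient step in $G \circ K$ (via the chain rule, $K^\top g$ with $g \in \partial G(Kx)$), and injected Gaussian noise. The following is a minimal Python sketch under illustrative assumptions: a quadratic $F(x) = \|x\|^2/2$, $G = \lambda \|\cdot\|_1$ with $\mathrm{sign}(\cdot)$ as a subgradient, and a small linear operator $K$. The function names and the constant step size are hypothetical and do not reproduce the paper's step-size conditions or its implicit variant.

```python
import numpy as np

def subgradient_langevin(grad_F, subgrad_G, K, x0, tau, n_iter, rng=None):
    """Explicit subgradient Langevin iteration (illustrative sketch).

    Each step: x <- x - tau * (grad F(x) + K^T g) + sqrt(2 * tau) * noise,
    where g is a subgradient of G at Kx.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_iter, x.size))
    for k in range(n_iter):
        # Drift: gradient of the smooth part plus a subgradient of G∘K.
        drift = grad_F(x) + K.T @ subgrad_G(K @ x)
        # Langevin update with Gaussian noise scaled by sqrt(2 * tau).
        x = x - tau * drift + np.sqrt(2.0 * tau) * rng.standard_normal(x.shape)
        samples[k] = x
    return samples

# Toy 2D example: F(x) = ||x||^2 / 2 (smooth), G(y) = lam * ||y||_1 (non-smooth),
# K a small finite-difference-like operator; sign(.) is a valid subgradient of |.|.
K = np.array([[1.0, -1.0],
              [0.0, 1.0]])
lam = 1.0
samples = subgradient_langevin(
    grad_F=lambda x: x,
    subgrad_G=lambda y: lam * np.sign(y),
    K=K,
    x0=np.zeros(2),
    tau=1e-2,
    n_iter=10_000,
)
```

In the implicit variant mentioned in the abstract, the explicit gradient step in $F$ would presumably be replaced by a backward (proximal-type) step; that case is not reproduced in this sketch.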
