Targeted Separation and Convergence with Kernel Discrepancies

26 September 2022
Alessandro Barp
Carl-Johann Simon-Gabriel
Mark Girolami
Lester W. Mackey
Abstract

Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference. In each setting, these kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or even (ii) control weak convergence to P. In this article we derive new sufficient and necessary conditions to ensure (i) and (ii). For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels and for controlling convergence with bounded kernels. We use these results on $\mathbb{R}^d$ to substantially broaden the known conditions for KSD separation and convergence control and to develop the first KSDs known to exactly metrize weak convergence to P. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
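Background sketch (standard definitions for orientation, not quoted from the paper): for a reproducing kernel $k$ with RKHS $\mathcal{H}_k$, the MMD between probability measures $\mu$ and $\nu$ is
$$\mathrm{MMD}_k(\mu,\nu) \;=\; \sup_{\|f\|_{\mathcal{H}_k}\le 1}\Big|\int f\,d\mu - \int f\,d\nu\Big| \;=\; \Big\|\int k(\cdot,x)\,d\mu(x) - \int k(\cdot,y)\,d\nu(y)\Big\|_{\mathcal{H}_k}.$$
The (Langevin) KSD targeting a distribution P with differentiable density $p$ on $\mathbb{R}^d$ is the MMD induced by the Stein kernel
$$k_P(x,y) \;=\; \nabla_x\!\cdot\!\nabla_y k(x,y) \;+\; \nabla_x\log p(x)\!\cdot\!\nabla_y k(x,y) \;+\; \nabla_y\log p(y)\!\cdot\!\nabla_x k(x,y) \;+\; \big(\nabla_x\log p(x)\!\cdot\!\nabla_y\log p(y)\big)\,k(x,y),$$
so that $\mathrm{KSD}_P(\mu)^2 = \iint k_P(x,y)\,d\mu(x)\,d\mu(y)$, computable without the normalizing constant of $p$. The paper's conditions (i) and (ii) then concern when such discrepancies vanish only at P and when their convergence to zero implies weak convergence to P.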

View on arXiv
@article{barp2025_2209.12835,
  title={Targeted Separation and Convergence with Kernel Discrepancies},
  author={Alessandro Barp and Carl-Johann Simon-Gabriel and Mark Girolami and Lester Mackey},
  journal={arXiv preprint arXiv:2209.12835},
  year={2025}
}