Random Feature Stein Discrepancies

20 June 2018
Jonathan H. Huggins
Lester W. Mackey
Abstract

Computable Stein discrepancies have been deployed for a variety of applications, ranging from sampler selection in posterior inference to approximate Bayesian inference to goodness-of-fit testing. Existing convergence-determining Stein discrepancies admit strong theoretical guarantees but suffer from a computational cost that grows quadratically in the sample size. While linear-time Stein discrepancies have been proposed for goodness-of-fit testing, they exhibit avoidable degradations in testing power -- even when power is explicitly optimized. To address these shortcomings, we introduce feature Stein discrepancies (ΦSDs), a new family of quality measures that can be cheaply approximated using importance sampling. We show how to construct ΦSDs that provably determine the convergence of a sample to its target and develop high-accuracy approximations -- random ΦSDs (RΦSDs) -- which are computable in near-linear time. In our experiments with sampler selection for approximate posterior inference and goodness-of-fit testing, RΦSDs perform as well as or better than quadratic-time kernel Stein discrepancies (KSDs) while being orders of magnitude faster to compute.
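The computational idea in the abstract is to replace an exact aggregation over feature parameters with an importance-sampled average over M random features, so the discrepancy can be estimated at O(nM) cost instead of the O(n²) cost of a kernel Stein discrepancy. The snippet below is a minimal illustrative sketch of that recipe under simplifying assumptions, not the paper's exact construction: it assumes cosine random Fourier features with standard normal frequencies, uniform importance weights, a standard normal target, and the Langevin Stein operator; the names `rphisd` and `score_std_normal` are hypothetical.

```python
import numpy as np

def score_std_normal(x):
    """Score function grad log p(x) for a standard normal target."""
    return -x

def rphisd(sample, score, n_features=100, gamma=2.0, seed=0):
    """Illustrative random feature Stein discrepancy (a sketch).

    Approximates an integral over feature parameters omega by Monte
    Carlo: for each omega, apply the Langevin Stein operator to the
    feature phi(x, omega) = cos(omega . x + b), average over the
    sample, and aggregate the feature-wise norms. Cost is O(n * M),
    near-linear in the sample size n.
    """
    rng = np.random.default_rng(seed)
    n, d = sample.shape
    # Draw random feature parameters. The proposal here is standard
    # normal frequencies with uniform phases, and the importance
    # weights are taken as uniform for simplicity (an assumption).
    omegas = rng.standard_normal((n_features, d))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

    grads = score(sample)                      # (n, d) target scores
    total = 0.0
    for omega, b in zip(omegas, phases):
        z = sample @ omega + b                 # (n,) feature inputs
        phi = np.cos(z)                        # feature values
        dphi = -np.sin(z)[:, None] * omega     # feature gradients, (n, d)
        # Langevin Stein operator: T_p phi = grad phi + phi * grad log p
        stein = dphi + phi[:, None] * grads    # (n, d)
        mean_stein = stein.mean(axis=0)        # sample mean per feature
        total += np.linalg.norm(mean_stein) ** gamma
    return (total / n_features) ** (1.0 / gamma)

# Usage: a sample from the target should yield a small discrepancy,
# an off-target (shifted) sample a clearly larger one.
rng = np.random.default_rng(1)
good = rng.standard_normal((2000, 2))
bad = rng.standard_normal((2000, 2)) + 1.5
print(rphisd(good, score_std_normal))   # near zero
print(rphisd(bad, score_std_normal))    # noticeably larger
```

The reason on-target samples score near zero is Stein's identity: the Stein-transformed features have mean zero under the target, so their empirical means vanish as the sample converges. Each feature touches every sample point exactly once, which is what keeps the overall cost near-linear in n.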
