arXiv:1803.10161
Stein Points

27 March 2018
W. Chen
Lester W. Mackey
Jackson Gorham
François‐Xavier Briol
Chris J. Oates
Abstract

An important task in computational statistics and machine learning is to approximate a posterior distribution p(x) with an empirical measure supported on a set of representative points {x_i}_{i=1}^n. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when n is small. To this end, we present `Stein Points'. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and p(x). Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
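The greedy variant described in the abstract can be sketched as follows: at each step, the next point is chosen to minimise the kernel Stein discrepancy of the augmented point set, which reduces to minimising k0(x, x)/2 + Σ_j k0(x_j, x) over candidate locations x, where k0 is a Stein kernel built from a base kernel and the score ∇ log p. The sketch below is an illustration under stated assumptions, not the paper's reference implementation: it fixes a one-dimensional standard-normal target, an inverse multiquadric base kernel with unit parameters, and a grid search over candidates; the function name `stein_points_greedy` and the candidate grid are choices made here for exposition.

```python
import numpy as np

def stein_points_greedy(score, n_points, candidates):
    """Greedy selection of Stein Points for a one-dimensional target.

    score      : function returning d/dx log p(x) for the target p.
    candidates : 1-D array of candidate locations searched at each step.

    Base kernel is the inverse multiquadric k(x, y) = (1 + (x - y)^2)^(-1/2);
    the Langevin Stein kernel k0 is derived from it in closed form below.
    """
    def k0(x, y):
        r = x - y
        q = 1.0 + r ** 2
        k = q ** -0.5                                 # base kernel
        dk_dx = -r * q ** -1.5                        # d k / dx
        dk_dy = r * q ** -1.5                         # d k / dy
        d2k = q ** -1.5 - 3.0 * r ** 2 * q ** -2.5    # d^2 k / dx dy
        sx, sy = score(x), score(y)
        return d2k + sx * dk_dy + sy * dk_dx + sx * sy * k

    points = []
    for _ in range(n_points):
        # Greedy objective for the next point x (terms independent of x
        # are dropped): k0(x, x) / 2 + sum over chosen x_j of k0(x_j, x).
        objective = 0.5 * k0(candidates, candidates)
        for xj in points:
            objective = objective + k0(xj, candidates)
        points.append(float(candidates[np.argmin(objective)]))
    return np.array(points)

# Example: standard normal target, so score(x) = -x.
pts = stein_points_greedy(lambda x: -x, 8, np.linspace(-4.0, 4.0, 801))
```

With this target, the first selected point sits at the mode (where k0(x, x) = 1 + x² is smallest) and subsequent points spread out to cover the mass of the distribution, which is the small-n behaviour the abstract emphasises.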
