Efficient and principled score estimation

23 May 2017
Danica J. Sutherland
Heiko Strathmann
Michael Arbel
Arthur Gretton
arXiv: 1705.08360 (abs / PDF / HTML)
Abstract

We propose a fast method with statistical guarantees for learning an exponential family density model whose natural parameter lies in a reproducing kernel Hilbert space and may be infinite-dimensional. The model is learned by fitting the derivative of the log density, the score, thus avoiding the need to compute a normalization constant. We improve the computational efficiency of an earlier solution with a low-rank, Nyström-like approximation. The new estimator remains consistent and is shown to converge in Fisher distance at the same rate as a full-rank solution, with guarantees on the degree of cost and storage reduction. We compare against a popular score-learning approach based on a denoising autoencoder, in experiments on density estimation and in the construction of an adaptive Hamiltonian Monte Carlo sampler. Unlike the autoencoder, our estimator comes with statistical guarantees; it is also more data-efficient when estimating the score, runs faster, and has fewer parameters (which can be tuned in a principled and interpretable way).

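To make the idea concrete, here is a minimal NumPy sketch of score matching for a kernel exponential family with a small set of inducing points, in the spirit of the low-rank approach the abstract describes. It is not the paper's exact Nyström estimator: the Gaussian kernel, the flat base measure, the bandwidth `sigma`, the ridge regularizer `lam`, the number of inducing points, and the helper names (`gauss_kernel_derivs`, `fit_score_model`, `estimate_score`) are all illustrative assumptions.

```python
import numpy as np


def gauss_kernel_derivs(X, Z, sigma):
    """Gaussian kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2)) and its
    first and second partial derivatives in x, for all pairs (x_i, z_m)."""
    diff = X[:, None, :] - Z[None, :, :]                 # shape (n, m, d)
    sq = (diff ** 2).sum(axis=-1)                        # shape (n, m)
    K = np.exp(-sq / (2 * sigma ** 2))                   # shape (n, m)
    dK = -diff / sigma ** 2 * K[..., None]               # d k / d x_d
    d2K = (diff ** 2 / sigma ** 4 - 1.0 / sigma ** 2) * K[..., None]  # d^2 k / d x_d^2
    return K, dK, d2K


def fit_score_model(X, n_inducing=50, sigma=1.0, lam=1e-3, seed=0):
    """Fit f(x) = sum_m alpha_m k(x, z_m) by score matching.

    With a flat base measure, the score-matching objective is quadratic in
    alpha, J(alpha) = b^T alpha + 0.5 * alpha^T C alpha, so the regularised
    minimiser is available in closed form: alpha = -(C + lam I)^{-1} b.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_inducing, len(X)), replace=False)
    Z = X[idx]                                           # inducing points
    _, dK, d2K = gauss_kernel_derivs(X, Z, sigma)
    n = len(X)
    b = d2K.sum(axis=(0, 2)) / n                         # linear term, shape (m,)
    C = np.einsum('imd,ild->ml', dK, dK) / n             # quadratic term, (m, m)
    alpha = -np.linalg.solve(C + lam * np.eye(len(Z)), b)
    return Z, alpha


def estimate_score(X_new, Z, alpha, sigma=1.0):
    """Estimated score: grad_x log p_hat(x) = grad_x f(x)."""
    _, dK, _ = gauss_kernel_derivs(X_new, Z, sigma)
    return np.einsum('m,imd->id', alpha, dK)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))                        # standard Gaussian data
    Z, alpha = fit_score_model(X, n_inducing=50, sigma=1.5, lam=1e-2)
    s = estimate_score(X, Z, alpha, sigma=1.5)
    # For a standard Gaussian the true score is -x; check rough agreement.
    print("mean abs deviation from -x:", np.abs(s + X).mean())
```

Because the objective is quadratic in the coefficients, fitting reduces to one linear solve whose size is set by the number of inducing points rather than the full sample size, which is the kind of cost and storage reduction the low-rank construction is after.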