Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature and Bayesian Optimization

11 January 2024
Xu Cai
Jonathan Scarlett
arXiv:2401.05716
Abstract

In this paper, we study the problem of estimating the normalizing constant $\int e^{-\lambda f(x)}\,dx$ through queries to the black-box function $f$, where $f$ belongs to a reproducing kernel Hilbert space (RKHS) and $\lambda$ is a problem parameter. We show that to estimate the normalizing constant within a small relative error, the level of difficulty depends on the value of $\lambda$: when $\lambda$ approaches zero, the problem is similar to Bayesian quadrature (BQ), while when $\lambda$ approaches infinity, the problem is similar to Bayesian optimization (BO). More generally, the problem interpolates between BQ and BO. We find that this pattern holds even when the function evaluations are noisy, bringing new aspects to this topic. Our findings are supported by both algorithm-independent lower bounds and algorithmic upper bounds, as well as simulation studies conducted on a variety of benchmark functions.
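To make the setup concrete, here is a minimal plug-in estimator sketch, assuming a one-dimensional domain $[0,1]$, noiseless evaluations, and a Matérn Gaussian-process surrogate. The test function, query budget, and all parameter choices are hypothetical, and this is not the algorithm whose upper bounds the paper proves; it only illustrates the quantity being estimated and the two regimes.

```python
# Illustrative sketch only (NOT the paper's algorithm): estimate
# Z(lambda) = \int_0^1 exp(-lambda * f(x)) dx by querying the black-box f
# at a few points, fitting a Gaussian-process surrogate, and plugging the
# posterior mean into the integral via an average over a fine grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def estimate_normalizing_constant(f, lam, n_queries=30, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, size=(n_queries, 1))  # query locations in [0, 1]
    y = np.array([f(x[0]) for x in X])              # noiseless black-box evaluations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)                                    # RKHS-style surrogate for f
    grid = np.linspace(0.0, 1.0, 2001).reshape(-1, 1)
    mu = gp.predict(grid)                           # posterior mean of f on the grid
    # Mean of exp(-lambda * mu) times the interval length (here 1) approximates Z.
    return float(np.mean(np.exp(-lam * mu)))

# For small lambda the integrand is nearly flat and the task resembles plain
# quadrature (the BQ-like regime); for large lambda exp(-lambda * f) concentrates
# near the minimizer of f, so accuracy hinges on locating the minimum (BO-like).
f = lambda x: (x - 0.3) ** 2  # hypothetical test function
for lam in (0.1, 10.0, 1000.0):
    print(lam, estimate_normalizing_constant(f, lam))
```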
