Tight Regret Bounds for Noisy Optimization of a Brownian Motion

25 January 2020
Zexin Wang
Vincent Y. F. Tan
Jonathan Scarlett
arXiv:2001.09327
Abstract

We consider the problem of Bayesian optimization of a one-dimensional Brownian motion in which the $T$ adaptively chosen observations are corrupted by Gaussian noise. We show that the smallest possible expected cumulative regret and the smallest possible expected simple regret scale as $\Omega(\sigma\sqrt{T / \log(T)}) \cap \mathcal{O}(\sigma\sqrt{T} \cdot \log T)$ and $\Omega(\sigma / \sqrt{T \log(T)}) \cap \mathcal{O}(\sigma \log T / \sqrt{T})$, respectively, where $\sigma^2$ is the noise variance. Thus, our upper and lower bounds are tight up to a factor of $\mathcal{O}((\log T)^{1.5})$. The upper bound uses an algorithm based on confidence bounds and the Markov property of Brownian motion (among other useful properties), and the lower bound is based on a reduction to binary hypothesis testing.
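
For reference, a standard way to define the two performance measures in this setting (stated here under a maximization convention, with the domain taken as $[0,1]$ for concreteness; the paper's precise conventions appear in the full text) is

    R_T = \sum_{t=1}^{T} \Bigl( \max_{x \in [0,1]} f(x) - f(x_t) \Bigr),
    \qquad
    r_T = \max_{x \in [0,1]} f(x) - f(\hat{x}_T),

where $f$ is the Brownian path, $x_t$ is the $t$-th query point, $\hat{x}_T$ is the point reported after $T$ queries, and expectations are taken over $f$ and the observation noise.

The flavor of the confidence-bound approach behind the upper bound can be illustrated with a minimal simulation sketch. To be clear, this is not the authors' algorithm: it discretizes $[0,1]$ to a finite grid, performs brute-force Gaussian-process regression with the Brownian covariance $k(s,t) = \min(s,t)$ rather than exploiting the Markov property, and uses a heuristic confidence-width multiplier; the grid size, horizon, and constants below are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Discretize [0, 1] (an assumption of this sketch; the paper works
    # with the continuous domain).
    n_grid = 200
    grid = np.linspace(0.0, 1.0, n_grid)

    # Sample a Brownian path f on the grid: B(0) = 0, with independent
    # Gaussian increments whose variance equals the step size.
    dt = np.diff(grid, prepend=0.0)
    f = np.cumsum(rng.normal(scale=np.sqrt(dt)))

    sigma = 0.1   # observation-noise standard deviation
    T = 50        # number of noisy queries
    beta = 2.0    # confidence-width multiplier (heuristic, not from the paper)

    def brownian_kernel(a, b):
        """Covariance of standard Brownian motion: k(s, t) = min(s, t)."""
        return np.minimum.outer(a, b)

    X, y = [], []
    cum_regret = 0.0
    f_star = f.max()

    for t in range(T):
        if not X:
            x_idx = rng.integers(n_grid)  # first query: arbitrary point
        else:
            Xa = np.array(X)
            ya = np.array(y)
            K = brownian_kernel(Xa, Xa) + sigma**2 * np.eye(len(Xa))
            k_star = brownian_kernel(grid, Xa)       # shape (n_grid, t)
            mu = k_star @ np.linalg.solve(K, ya)     # posterior mean
            v = np.linalg.solve(K, k_star.T)
            # Posterior variance; note k(x, x) = x on [0, 1].
            var = grid - np.einsum('ij,ji->i', k_star, v)
            ucb = mu + beta * np.sqrt(np.maximum(var, 0.0))
            x_idx = int(np.argmax(ucb))              # query the UCB maximizer
        X.append(grid[x_idx])
        y.append(f[x_idx] + rng.normal(scale=sigma))
        cum_regret += f_star - f[x_idx]

    print(f"cumulative regret after {T} queries: {cum_regret:.3f}")

The brute-force regression above costs $O(t^3)$ per step; the paper's algorithm, by contrast, exploits the Markov property of Brownian motion (among the other properties mentioned in the abstract), which this sketch does not.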
