Tight Regret Bounds for Noisy Optimization of a Brownian Motion

IEEE Transactions on Signal Processing (TSP), 2020
Abstract

We consider the problem of Bayesian optimization of a one-dimensional Brownian motion in which the $T$ adaptively chosen observations are corrupted by Gaussian noise. We show that the smallest possible expected simple regret and the smallest possible expected cumulative regret scale as $\Omega(1/\sqrt{T \log T}) \cap \mathcal{O}(\log T / \sqrt{T})$ and $\Omega(\sqrt{T / \log T}) \cap \mathcal{O}(\sqrt{T} \cdot \log T)$, respectively. Thus, our upper and lower bounds are tight up to a factor of $\mathcal{O}((\log T)^{1.5})$. The upper bound uses an algorithm based on confidence bounds and the Markov property of Brownian motion, and the lower bound is based on a reduction to binary hypothesis testing.
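The setting above can be illustrated with a minimal simulation: sample a Brownian motion path on a grid, make $T$ noisy queries with a generic UCB-style confidence-bound rule, and measure the simple regret of the final recommendation. This is a hedged sketch of the problem setup, not the paper's algorithm; the grid size, noise level, and confidence radius below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1] and sample one Brownian motion path W on the grid;
# W plays the role of the unknown objective to be maximized.
n_grid = 50
dW = rng.normal(0.0, np.sqrt(1.0 / n_grid), size=n_grid)
W = np.concatenate(([0.0], np.cumsum(dW)[:-1]))  # W(0) = 0

sigma = 0.1  # standard deviation of the Gaussian observation noise
T = 200      # budget of adaptively chosen noisy observations

# Hypothetical UCB-style rule (an illustration, not the paper's method):
# query the grid point whose running mean plus a confidence radius is largest.
counts = np.zeros(n_grid)
sums = np.zeros(n_grid)

for t in range(T):
    means = sums / np.maximum(counts, 1)
    radius = sigma * np.sqrt(2.0 * np.log(T) / np.maximum(counts, 1))
    ucb = np.where(counts == 0, np.inf, means + radius)  # try unseen points first
    x = int(np.argmax(ucb))
    y = W[x] + rng.normal(0.0, sigma)  # noisy observation of W at x
    counts[x] += 1
    sums[x] += y

# Recommend the point with the best empirical mean; simple regret is the
# gap between the true maximum of W and the value at the recommendation.
emp_means = np.where(counts > 0, sums / np.maximum(counts, 1), -np.inf)
best = int(np.argmax(emp_means))
simple_regret = W.max() - W[best]
print(f"simple regret: {simple_regret:.4f}")
```

Note that the paper's algorithm additionally exploits the Markov property of Brownian motion to refine the grid adaptively, which this uniform-grid sketch does not attempt.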
