
arXiv:1906.03364
Online Forecasting of Total-Variation-bounded Sequences

8 June 2019
Dheeraj Baby
Yu-Xiang Wang
    AI4TS
Abstract

We consider the problem of online forecasting of sequences of length $n$ with total variation at most $C_n$ using observations contaminated by independent $\sigma$-subgaussian noise. We design an $O(n\log n)$-time algorithm that achieves a cumulative square error of $\tilde{O}(n^{1/3}C_n^{2/3}\sigma^{4/3} + C_n^2)$ with high probability. We also prove a lower bound that matches the upper bound in all parameters (up to a $\log(n)$ factor). To the best of our knowledge, this is the first \emph{polynomial-time} algorithm that achieves the optimal $O(n^{1/3})$ rate in forecasting total-variation-bounded sequences and the first algorithm that \emph{adapts to unknown} $C_n$. Our proof techniques leverage the special localized structure of the Haar wavelet basis and the adaptivity to unknown smoothness parameters in classical wavelet smoothing [Donoho et al., 1998]. We also compare our model to the rich literature of dynamic regret minimization and nonstationary stochastic optimization, where our problem can be treated as a special case. We show that the workhorse in those settings --- online gradient descent and its variants with a fixed restarting schedule --- are instances of a class of \emph{linear forecasters} that incur a suboptimal regret of $\tilde{\Omega}(\sqrt{n})$. This implies that more adaptive algorithms are necessary to obtain the optimal rate.
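To give a concrete feel for the classical ingredient the abstract cites, the sketch below is a minimal offline Haar-wavelet soft-thresholding denoiser in the style of Donoho et al. [1998]: transform the noisy observations into the orthonormal Haar basis, shrink the detail coefficients at the universal threshold $\sigma\sqrt{2\log n}$, and invert. This is not the paper's online algorithm; the function name, the power-of-two length requirement, and the example signal are assumptions of this illustration.

import numpy as np

def haar_soft_threshold(y, sigma):
    """Denoise y by soft-thresholding its orthonormal Haar wavelet
    coefficients at the universal threshold sigma * sqrt(2 log n).
    Offline illustration only; assumes len(y) is a power of two."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    assert n & (n - 1) == 0, "this sketch assumes len(y) is a power of two"

    # Forward orthonormal Haar transform: repeatedly split into
    # pairwise averages (coarse part) and differences (detail part).
    coarse, details = y.copy(), []
    while len(coarse) > 1:
        avg = (coarse[0::2] + coarse[1::2]) / np.sqrt(2)
        diff = (coarse[0::2] - coarse[1::2]) / np.sqrt(2)
        details.append(diff)
        coarse = avg

    # Soft-threshold the detail coefficients at sigma * sqrt(2 log n).
    thr = sigma * np.sqrt(2 * np.log(n))
    details = [np.sign(d) * np.maximum(np.abs(d) - thr, 0.0) for d in details]

    # Inverse Haar transform, reconstructing from the coarsest level up.
    rec = coarse
    for diff in reversed(details):
        up = np.empty(2 * len(rec))
        up[0::2] = (rec + diff) / np.sqrt(2)
        up[1::2] = (rec - diff) / np.sqrt(2)
        rec = up
    return rec

# Example: a piecewise-constant (hence total-variation-bounded) signal plus noise.
rng = np.random.default_rng(0)
theta = np.repeat([0.0, 1.0, -0.5, 2.0], 64)        # length 256, small total variation
y = theta + 0.3 * rng.standard_normal(theta.size)   # sigma = 0.3 Gaussian noise
theta_hat = haar_soft_threshold(y, sigma=0.3)
print(np.mean((theta_hat - theta) ** 2))            # well below the noise variance 0.09

Because the Haar detail coefficients of a total-variation-bounded signal are sparse and localized, thresholding adapts to the unknown smoothness level; the paper's contribution is to obtain this kind of adaptivity in the harder online forecasting setting.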
