Online nonparametric regression with Sobolev kernels

6 February 2021
O. Zadorozhnyi
Pierre Gaillard
Sébastien Gerchinovitz
Alessandro Rudi
Abstract

In this work we investigate a variant of the online kernelized ridge regression algorithm in the setting of $d$-dimensional adversarial nonparametric regression. We derive regret upper bounds over the Sobolev spaces $W_p^{\beta}(\mathcal{X})$, $p \geq 2$, $\beta > \frac{d}{p}$. The upper bounds are complemented by a minimax regret analysis, which reveals that in the cases $\beta > \frac{d}{2}$ or $p = \infty$ these rates are (essentially) optimal. Finally, we compare the kernelized ridge regression forecaster to known nonparametric forecasters in terms of regret rates and computational complexity, as well as to the excess-risk rates achievable in the setting of statistical (i.i.d.) nonparametric regression.
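The forecaster studied in the abstract can be sketched as follows: at each round $t$, predict with the kernel ridge estimator fitted on the previously revealed pairs, then observe the new label. This is a minimal illustrative sketch, not the authors' implementation; the Matérn kernel and the regularization value `lam` are assumptions (Matérn RKHSs are norm-equivalent to certain Sobolev spaces, which is why it is a natural stand-in here).

```python
import numpy as np

def matern_kernel(x, y, length=1.0):
    # Matérn-3/2 kernel; its RKHS is norm-equivalent to a Sobolev space
    # (assumed choice for illustration, not prescribed by the paper).
    r = np.linalg.norm(x - y) / length
    return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def online_kernel_ridge(X, y, lam=1.0, kernel=matern_kernel):
    """Online kernelized ridge regression sketch.

    At round t, solve (K_t + lam * I) alpha = y_{1:t} on the first t
    observations and predict y[t] before it is revealed. preds[0] is
    left at 0 since no data is available at the first round.
    """
    T = len(X)
    preds = np.zeros(T)
    for t in range(1, T):
        # Gram matrix of the first t points (recomputed each round for
        # clarity; incremental Cholesky updates would avoid this cost).
        K = np.array([[kernel(X[i], X[j]) for j in range(t)]
                      for i in range(t)])
        alpha = np.linalg.solve(K + lam * np.eye(t), y[:t])
        k_t = np.array([kernel(X[i], X[t]) for i in range(t)])
        preds[t] = k_t @ alpha
    return preds
```

Rebuilding the Gram matrix from scratch costs $O(t^2)$ kernel evaluations per round plus an $O(t^3)$ solve; the computational-complexity comparison mentioned in the abstract is precisely about such per-round costs, which rank-one updates of a Cholesky factorization can reduce.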
