Single Trajectory Nonparametric Learning of Nonlinear Dynamics

16 February 2022
Ingvar M. Ziemann
H. Sandberg
Nikolai Matni
arXiv: 2202.08311 · PDF · HTML
Abstract

Given a single trajectory of a dynamical system, we analyze the performance of the nonparametric least squares estimator (LSE). More precisely, we give nonasymptotic expected $l^2$-distance bounds between the LSE and the true regression function, where expectation is evaluated on a fresh, counterfactual, trajectory. We leverage recently developed information-theoretic methods to establish the optimality of the LSE for nonparametric hypothesis classes in terms of supremum norm metric entropy and a subgaussian parameter. Next, we relate this subgaussian parameter to the stability of the underlying process using notions from dynamical systems theory. When combined, these developments lead to rate-optimal error bounds that scale as $T^{-1/(2+q)}$ for suitably stable processes and hypothesis classes with metric entropy growth of order $\delta^{-q}$. Here, $T$ is the length of the observed trajectory, $\delta \in \mathbb{R}_+$ is the packing granularity and $q \in (0,2)$ is a complexity term. Finally, we specialize our results to a number of scenarios of practical interest, such as Lipschitz dynamics, generalized linear models, and dynamics described by functions in certain classes of Reproducing Kernel Hilbert Spaces (RKHS).
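
The RKHS setting mentioned in the abstract can be made concrete with a small numerical sketch. The Python snippet below is a hypothetical illustration, not the authors' code: it fits a nonparametric least squares estimator (here instantiated as kernel ridge regression with an RBF kernel) to one-step dynamics observed along a single trajectory, then measures its squared error on a fresh trajectory, mirroring the expected $l^2$-distance evaluation described above. The dynamics f_star, noise level, kernel bandwidth, and regularization lam are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration of single-trajectory nonparametric least squares
# for one-step dynamics x_{t+1} = f_*(x_t) + noise. The estimator is kernel
# ridge regression (an RKHS-based LSE); all constants below are assumptions
# chosen for this example, not values from the paper.

rng = np.random.default_rng(0)

def f_star(x):
    # Example stable nonlinear dynamics (contractive, Lipschitz constant < 1).
    return 0.8 * np.tanh(x)

# --- Observe a single trajectory of length T ---
T = 500
sigma = 0.1
x = np.zeros(T + 1)
for t in range(T):
    x[t + 1] = f_star(x[t]) + sigma * rng.standard_normal()

X, Y = x[:-1], x[1:]  # covariate/target pairs taken from the same trajectory

# --- Nonparametric LSE via kernel ridge regression with an RBF kernel ---
def rbf_kernel(a, b, bandwidth=1.0):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))

lam = 1e-3  # ridge regularization (assumed)
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * T * np.eye(T), Y)

def f_hat(z):
    # Kernel ridge predictor evaluated at new points z.
    return rbf_kernel(np.atleast_1d(z), X) @ alpha

# --- Evaluate the estimator on a fresh trajectory from the same process ---
x_new = np.zeros(T + 1)
for t in range(T):
    x_new[t + 1] = f_star(x_new[t]) + sigma * rng.standard_normal()

err = np.mean((f_hat(x_new[:-1]) - f_star(x_new[:-1])) ** 2)
print(f"mean squared error of f_hat on a fresh trajectory: {err:.4f}")
```

Increasing T in this sketch should shrink the fresh-trajectory error, consistent in spirit with the $T^{-1/(2+q)}$ rate stated in the abstract, though the exponent here depends on the kernel and is not estimated by this toy experiment.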
