
Learning linear dynamical systems under convex constraints

Abstract

We consider the problem of finite-time identification of linear dynamical systems from $T$ samples of a single trajectory. Recent results have predominantly focused on the setup where no structural assumption is made on the system matrix $A^* \in \mathbb{R}^{n \times n}$, and have consequently analyzed the ordinary least squares (OLS) estimator in detail. We assume prior structural information on $A^*$ is available, which can be captured in the form of a convex set $\mathcal{K}$ containing $A^*$. For the solution of the ensuing constrained least squares estimator, we derive non-asymptotic error bounds in the Frobenius norm that depend on the local size of $\mathcal{K}$ at $A^*$. To illustrate the usefulness of these results, we instantiate them for four examples, namely when (i) $A^*$ is sparse and $\mathcal{K}$ is a suitably scaled $\ell_1$ ball; (ii) $\mathcal{K}$ is a subspace; (iii) $\mathcal{K}$ consists of matrices each of which is formed by sampling a bivariate convex function on a uniform $n \times n$ grid (convex regression); (iv) $\mathcal{K}$ consists of matrices each row of which is formed by uniform sampling (with step size $1/T$) of a univariate Lipschitz function. In all these situations, we show that $A^*$ can be reliably estimated for values of $T$ much smaller than what is needed for the unconstrained setting.
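To make the setup concrete, the following is a minimal Python sketch (not from the paper) of the constrained least squares estimator for example (i), where $\mathcal{K}$ is a scaled $\ell_1$ ball and the estimate is computed by projected gradient descent on a single simulated trajectory. All numerical choices (dimensions, noise level, step size, iteration count, and the ball radius $\|A^*\|_1$) are illustrative assumptions, not values taken from the paper.

import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of a vector onto the l1 ball of the given radius
    # (standard sorting-based projection).
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def constrained_ls(X, Y, radius, n_iters=500):
    # Projected gradient descent for  min_A ||Y - A X||_F^2
    # subject to ||vec(A)||_1 <= radius, with X, Y of shape (n, T).
    n = X.shape[0]
    lr = 1.0 / np.linalg.norm(X @ X.T, 2)   # step 1/L, L = spectral norm of X X^T
    A = np.zeros((n, n))
    for _ in range(n_iters):
        grad = (A @ X - Y) @ X.T             # gradient of (1/2)||A X - Y||_F^2
        A = project_l1_ball((A - lr * grad).ravel(), radius).reshape(n, n)
    return A

# Toy usage: sparse A*, a single trajectory of length T, i.i.d. Gaussian noise.
rng = np.random.default_rng(0)
n, T = 20, 100
A_star = np.zeros((n, n))
support = rng.choice(n * n, size=2 * n, replace=False)
A_star.ravel()[support] = rng.normal(scale=0.2, size=2 * n)
x, traj = np.zeros(n), []
for _ in range(T + 1):
    traj.append(x)
    x = A_star @ x + rng.normal(size=n)
traj = np.array(traj).T                      # shape (n, T+1)
X, Y = traj[:, :-1], traj[:, 1:]             # regress x_{t+1} on x_t
A_hat = constrained_ls(X, Y, radius=np.abs(A_star).sum())
print("Frobenius error:", np.linalg.norm(A_hat - A_star, "fro"))

The projection step is the only place the constraint set enters, so the same sketch adapts to the other examples (a subspace, convex regression, or rowwise Lipschitz constraints) by swapping in the corresponding projection.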
