High-Dimensional Low-Rank Tensor Autoregressive Time Series Modeling

12 January 2021
Di Wang
Yao Zheng
Guodong Li
Abstract

Modern technological advances have enabled an unprecedented amount of structured data with complex temporal dependence, creating an urgent need for new methods to efficiently model and forecast high-dimensional tensor-valued time series. This paper provides a new modeling framework to accomplish this task via autoregression (AR). By considering a low-rank Tucker decomposition for the transition tensor, the proposed tensor AR can flexibly capture the underlying low-dimensional tensor dynamics, providing both substantial dimension reduction and meaningful multi-dimensional dynamic factor interpretations. For this model, we first study several nuclear-norm-regularized estimation methods and derive their non-asymptotic properties under the approximate low-rank setting. In particular, by leveraging the special balanced structure of the transition tensor, a novel convex regularization approach based on the sum of nuclear norms of square matricizations is proposed to efficiently encourage low-rankness of the coefficient tensor. To further improve the estimation efficiency under exact low-rankness, a non-convex estimator is proposed with a gradient descent algorithm, and its computational and statistical convergence guarantees are established. Simulation studies and an empirical analysis of tensor-valued time series data from multi-category import-export networks demonstrate the advantages of the proposed approach.
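The following is a minimal NumPy sketch, not the authors' code, of the kind of model the abstract describes: a matrix-valued (order-2 tensor) AR(1) process whose fourth-order transition tensor has a low-rank Tucker structure. The dimensions, Tucker ranks, noise scale, and the spectral rescaling used to keep the process stationary are all illustrative assumptions.

```python
# Sketch of a K = 2 (matrix-valued) tensor AR(1) with a Tucker-low-rank
# transition tensor, assuming A = G x1 U1 x2 U2 x3 U3 x4 U4 and
# Y_t = <A, Y_{t-1}> + E_t (contraction over the last two modes of A).
import numpy as np

rng = np.random.default_rng(0)
p1, p2 = 6, 5                  # each observation Y_t is a p1 x p2 matrix
r1, r2, r3, r4 = 2, 2, 2, 2    # illustrative multilinear (Tucker) ranks

# Tucker factors and core of the p1 x p2 x p1 x p2 transition tensor A
U = [rng.standard_normal((p, r))
     for p, r in [(p1, r1), (p2, r2), (p1, r3), (p2, r4)]]
G = rng.standard_normal((r1, r2, r3, r4))
A = np.einsum('abcd,ia,jb,kc,ld->ijkl', G, *U)

# Matricize A into the (p1*p2) x (p1*p2) matrix acting on vec(Y_{t-1}),
# then rescale so the spectral radius is below 1 (stationarity).
A_mat = A.reshape(p1 * p2, p1 * p2)
A_mat *= 0.9 / np.abs(np.linalg.eigvals(A_mat)).max()

# Simulate the series in vectorized form: vec(Y_t) = A_mat vec(Y_{t-1}) + vec(E_t)
T = 500
Y = np.zeros((T, p1 * p2))
for t in range(1, T):
    Y[t] = A_mat @ Y[t - 1] + 0.1 * rng.standard_normal(p1 * p2)

# The mode-wise matricizations of A have ranks bounded by (r1, r2, r3, r4);
# this is the low-dimensional structure that nuclear-norm regularization
# and the non-convex gradient-descent estimator are designed to exploit.
print(np.linalg.matrix_rank(A.reshape(p1, -1)))                          # <= r1
print(np.linalg.matrix_rank(A.transpose(1, 0, 2, 3).reshape(p2, -1)))    # <= r2
```

Under this structure, the number of free parameters grows with the Tucker ranks rather than with (p1*p2)^2, which is the dimension-reduction benefit the abstract refers to.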
