Enhancing Time Series Forecasting via Logic-Inspired Regularization

10 March 2025
Jianqi Zhang
Jingyao Wang
Xingchen Shen
Wenwen Qiang
AI4TS
Abstract

Time series forecasting (TSF) plays a crucial role in many applications, and Transformer-based methods are among the mainstream techniques for TSF. Existing methods treat all token dependencies equally. However, we find that the effectiveness of token dependencies varies across forecasting scenarios, and existing methods ignore these differences, which limits their performance. This raises two issues: (1) what are effective token dependencies, and (2) how can we learn them? From a logical perspective, we align Transformer-based TSF methods with a logical framework and define effective token dependencies as those that ensure the tokens act as atomic formulas (Issue 1). We then align the learning process of Transformer-based methods with the process of obtaining atomic formulas in logic, which inspires the design of a method for learning these effective dependencies (Issue 2). Specifically, we propose Attention Logic Regularization (Attn-L-Reg), a plug-and-play method that guides the model to use fewer but more effective dependencies by making the attention map sparse, thereby ensuring that the tokens act as atomic formulas and improving prediction performance. Extensive experiments and theoretical analysis confirm the effectiveness of Attn-L-Reg.
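The abstract states only that Attn-L-Reg is a plug-and-play penalty that sparsifies the attention map; the exact formulation is not given here. The sketch below shows one plausible realization in PyTorch, assuming an entropy-minimization penalty on softmax-normalized attention rows (a plain L1 penalty would be constant on rows that sum to 1, so row entropy is a standard sparsity surrogate). The penalty form, the coefficient `lambda_reg`, and a model that returns its attention maps alongside predictions are all illustrative assumptions, not the paper's published method.

```python
import torch
import torch.nn as nn

def attention_entropy_penalty(attn_maps, lambda_reg=1e-4, eps=1e-8):
    """Sparsity surrogate for softmax attention (illustrative, not the paper's exact term).

    attn_maps: list of tensors of shape (batch, heads, query, key),
               each row softmax-normalized (sums to 1).
    Minimizing per-row Shannon entropy drives each row toward a few
    large entries, i.e. fewer but stronger token dependencies.
    """
    penalty = 0.0
    for a in attn_maps:
        row_entropy = -(a * (a + eps).log()).sum(dim=-1)  # (batch, heads, query)
        penalty = penalty + row_entropy.mean()
    return lambda_reg * penalty

def training_step(model, x, y, lambda_reg=1e-4):
    # ASSUMPTION: `model` returns (predictions, list_of_attention_maps);
    # most Transformer forecasters need a small hook to expose the maps.
    criterion = nn.MSELoss()
    y_hat, attn_maps = model(x)
    loss = criterion(y_hat, y)                      # standard forecasting loss
    loss = loss + attention_entropy_penalty(attn_maps, lambda_reg)
    return loss
```

Because the penalty is just an extra additive term on the training loss, it can be bolted onto an existing forecaster without architectural changes, which matches the "plug-and-play" claim in the abstract.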

@article{zhang2025_2503.06867,
  title={Enhancing Time Series Forecasting via Logic-Inspired Regularization},
  author={Jianqi Zhang and Jingyao Wang and Xingchen Shen and Wenwen Qiang},
  journal={arXiv preprint arXiv:2503.06867},
  year={2025}
}