Evaluating Temporal Plasticity in Foundation Time Series Models for Incremental Fine-tuning

20 April 2025
Jia Liu, Cheng Jinguo, Xia Fang, Zhenyuan Ma, Yuankai Wu
Communities: CLL · AI4TS · LRM
Abstract

Time series foundation models excel at diverse time series forecasting tasks, but their capacity for continuous improvement through incremental learning remains unexplored. We present the first comprehensive study of these models' temporal plasticity: their ability to progressively improve performance through continual learning while retaining existing capabilities. Through experiments on real-world datasets exhibiting distribution shifts, we evaluate both conventional deep learning models and foundation models using a novel continual learning framework. Our findings reveal that while traditional models suffer performance deterioration during incremental fine-tuning, foundation models like Time-MoE and Chronos demonstrate sustained improvement in predictive accuracy. This suggests that optimizing fine-tuning strategies for foundation models may be more valuable than developing domain-specific small models. Our research introduces new evaluation methodologies and insights for developing foundation time series models with robust continual learning capabilities.
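
The evaluation protocol the abstract describes, fine-tuning a model increment by increment along the time axis and measuring both forward accuracy and retention on earlier data, can be sketched as follows. This is a minimal illustration, not the paper's actual framework: the synthetic drifting series, the linear forecaster standing in for a foundation model, and all hyperparameters are assumptions made for demonstration.

import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
np.random.seed(0)

CONTEXT, HORIZON = 32, 8

def make_series(n=4000):
    # Synthetic series whose dynamics drift after the midpoint,
    # mimicking a real-world distribution shift.
    t = np.arange(n)
    y = np.sin(2 * np.pi * t / 50) + 0.1 * np.random.randn(n)
    y[n // 2:] += 0.002 * (t[n // 2:] - n // 2)
    return y.astype(np.float32)

def windows(y):
    # Slice a series into (context, horizon) forecasting pairs.
    X, Y = [], []
    for i in range(len(y) - CONTEXT - HORIZON):
        X.append(y[i:i + CONTEXT])
        Y.append(y[i + CONTEXT:i + CONTEXT + HORIZON])
    return torch.tensor(np.array(X)), torch.tensor(np.array(Y))

chunks = np.array_split(make_series(), 8)    # chronological increments
model = nn.Linear(CONTEXT, HORIZON)          # stand-in for a foundation model
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

X0, Y0 = windows(chunks[0])                  # fixed probe for retention
for k in range(len(chunks) - 1):
    X, Y = windows(chunks[k])                # fine-tune on increment k ...
    for _ in range(50):
        opt.zero_grad()
        loss_fn(model(X), Y).backward()
        opt.step()
    Xn, Yn = windows(chunks[k + 1])          # ... evaluate on increment k+1
    with torch.no_grad():
        fwd = loss_fn(model(Xn), Yn).item()
        ret = loss_fn(model(X0), Y0).item()  # has old data been forgotten?
    print(f"increment {k}: forward MSE {fwd:.4f}, retention MSE {ret:.4f}")

In the abstract's terms, a temporally plastic model shows forward MSE that keeps falling across increments while retention MSE stays flat; a model that forgets shows the retention column climbing instead.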

@article{liu2025_2504.14677,
  title={Evaluating Temporal Plasticity in Foundation Time Series Models for Incremental Fine-tuning},
  author={Jia Liu and Cheng Jinguo and Xia Fang and Zhenyuan Ma and Yuankai Wu},
  journal={arXiv preprint arXiv:2504.14677},
  year={2025}
}