Time to Embed: Unlocking Foundation Models for Time Series with Channel Descriptions

20 May 2025
Utsav Dutta, Sina Khoshfetrat Pakazad, Henrik Ohlsson

Communities: AI4TS, AIFin
Abstract

Traditional time series models are task-specific and often depend on dataset-specific training and extensive feature engineering. While Transformer-based architectures have improved scalability, foundation models, commonplace in text, vision, and audio, remain under-explored for time series and are largely restricted to forecasting. We introduce CHARM, a foundation embedding model for multivariate time series that learns shared, transferable, and domain-aware representations. To address the unique difficulties of time series foundation learning, CHARM incorporates architectural innovations that integrate channel-level textual descriptions while remaining invariant to channel order. The model is trained using a Joint Embedding Predictive Architecture (JEPA), with novel augmentation schemes and a loss function designed to improve interpretability and training stability. Our 7M-parameter model achieves state-of-the-art performance across diverse downstream tasks, setting a new benchmark for time series representation learning.
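The abstract names two mechanisms without detailing them: fusing channel-level text descriptions into an encoder that stays invariant to channel order, and JEPA training, in which a predictor regresses embeddings produced by a separate target encoder. The paper's actual implementation is not reproduced here; the PyTorch sketch below is only a minimal illustration of both ideas, and the module sizes, the additive fusion of description embeddings, and the smooth-L1 predictive loss are all assumptions.

import torch
import torch.nn as nn

class ChannelEncoder(nn.Module):
    """Embeds each channel's series and fuses its textual description.

    Attention across channels uses no positional encoding, so the layer is
    permutation-equivariant; pooling symmetrically over channels afterwards
    would make the final representation invariant to channel order.
    """

    def __init__(self, patch_len=16, d_model=128, d_text=128):
        super().__init__()
        self.patch_proj = nn.Linear(patch_len, d_model)  # per-patch projection
        self.text_proj = nn.Linear(d_text, d_model)      # description embedding
        self.channel_attn = nn.TransformerEncoderLayer(
            d_model, nhead=4, batch_first=True)

    def forward(self, x, text_emb):
        # x: (batch, channels, patches, patch_len)
        # text_emb: (batch, channels, d_text), e.g. from a frozen text encoder
        tokens = self.patch_proj(x).mean(dim=2)       # pool over time -> (b, c, d)
        tokens = tokens + self.text_proj(text_emb)    # inject channel semantics
        return self.channel_attn(tokens)              # (b, c, d)

def jepa_step(context_enc, target_enc, predictor, x_ctx, x_tgt, text_emb):
    """One JEPA-style step: predict target-branch embeddings from the context."""
    with torch.no_grad():                 # target branch gets no gradients
        target = target_enc(x_tgt, text_emb)
    pred = predictor(context_enc(x_ctx, text_emb))
    return nn.functional.smooth_l1_loss(pred, target)

# Toy usage: 2 samples, 5 channels, 10 patches of length 16.
enc, tgt = ChannelEncoder(), ChannelEncoder()
predictor = nn.Sequential(nn.Linear(128, 128), nn.GELU(), nn.Linear(128, 128))
x = torch.randn(2, 5, 10, 16)
desc = torch.randn(2, 5, 128)            # stand-in for description embeddings
loss = jepa_step(enc, tgt, predictor, x, x, desc)  # x_ctx == x_tgt only for the demo
loss.backward()

In a full JEPA setup, x_ctx would be a masked view of the series, x_tgt the held-out targets, and the target encoder would typically track the context encoder via an exponential moving average; the abstract's augmentation schemes and stability-oriented loss are not reproduced here.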

@article{dutta2025_2505.14543,
  title={Time to Embed: Unlocking Foundation Models for Time Series with Channel Descriptions},
  author={Utsav Dutta and Sina Khoshfetrat Pakazad and Henrik Ohlsson},
  journal={arXiv preprint arXiv:2505.14543},
  year={2025}
}