
Language in the Flow of Time: Time-Series-Paired Texts Weaved into a Unified Temporal Narrative

Abstract

While many advances in time series models focus exclusively on numerical data, research on multimodal time series, particularly those involving contextual textual information commonly encountered in real-world scenarios, remains in its infancy. With recent progress in large language models and time series learning, we revisit the integration of paired texts with time series through the Platonic Representation Hypothesis, which posits that representations of different modalities converge to shared spaces. In this context, we identify that time-series-paired texts may naturally exhibit periodic properties that closely mirror those of the original time series. Building on this insight, we propose a novel framework, Texts as Time Series (TaTS), which treats the time-series-paired texts as auxiliary variables of the time series. TaTS can be plugged into any existing numerical-only time series model, enabling it to handle time series data with paired texts effectively. Through extensive experiments on both multimodal time series forecasting and imputation tasks across benchmark datasets with various existing time series models, we demonstrate that TaTS can enhance predictive performance without modifying model architectures. Code available at this https URL.
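The following is a minimal sketch (not the authors' released code) of the core idea the abstract describes: each timestep's paired text is encoded into numeric channels that act as auxiliary variables, and the widened multivariate series is handed to an unchanged numerical-only forecaster. The names `embed_text` and `texts_as_time_series` are hypothetical placeholders introduced here for illustration; a real system would use an LLM or sentence encoder for the text embeddings.

```python
import numpy as np

def embed_text(text: str, dim: int = 4) -> np.ndarray:
    """Hypothetical text encoder; stands in for an LLM/sentence encoder."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(dim)

def texts_as_time_series(values: np.ndarray, texts: list[str]) -> np.ndarray:
    """Concatenate per-timestep text embeddings with the numeric series.

    values: shape (T, C)  -- original numerical channels
    texts:  length-T list -- one paired text per timestep
    returns shape (T, C + dim) -- series augmented with auxiliary text channels
    """
    text_channels = np.stack([embed_text(t) for t in texts])  # (T, dim)
    return np.concatenate([values, text_channels], axis=-1)   # (T, C + dim)

# Usage: the augmented series can be fed to any multivariate forecaster
# without changing its architecture, matching the plug-in claim above.
T, C = 8, 2
series = np.random.randn(T, C)
texts = [f"report for step {i}" for i in range(T)]
augmented = texts_as_time_series(series, texts)
print(augmented.shape)  # (8, 6)
```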

@article{li2025_2502.08942,
  title={Language in the Flow of Time: Time-Series-Paired Texts Weaved into a Unified Temporal Narrative},
  author={Zihao Li and Xiao Lin and Zhining Liu and Jiaru Zou and Ziwei Wu and Lecheng Zheng and Dongqi Fu and Yada Zhu and Hendrik Hamann and Hanghang Tong and Jingrui He},
  journal={arXiv preprint arXiv:2502.08942},
  year={2025}
}