
Context-Aware Probabilistic Modeling with LLM for Multimodal Time Series Forecasting

Abstract

Time series forecasting is important for applications spanning energy markets, climate analysis, and traffic management. However, existing methods struggle to effectively integrate exogenous texts and align them with the probabilistic nature of large language models (LLMs). Current approaches either employ shallow text-time series fusion via basic prompts or rely on deterministic numerical decoding that conflicts with LLMs' token-generation paradigm, which limits contextual awareness and distribution modeling. To address these limitations, we propose CAPTime, a context-aware probabilistic multimodal time series forecasting method that leverages text-informed abstraction and autoregressive LLM decoding. Our method first encodes temporal patterns using a pretrained time series encoder, then aligns them with textual contexts via learnable interactions to produce joint multimodal representations. By combining a mixture of distribution experts with frozen LLMs, we enable context-aware probabilistic forecasting while preserving LLMs' inherent distribution modeling capabilities. Experiments on diverse time series forecasting tasks demonstrate the superior accuracy and generalization of CAPTime, particularly in multimodal scenarios. Additional analysis highlights its robustness in data-scarce scenarios through hybrid probabilistic decoding.
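To make the "mixture of distribution experts" idea concrete, here is a minimal NumPy sketch of how a gating network could weight several Gaussian experts that each parameterize a forecast distribution from a joint multimodal hidden state. All names, dimensions, and weight shapes here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes (assumptions, not from the paper): hidden dim 16, 3 experts
H, K = 16, 3
h = rng.standard_normal(H)            # stand-in for the joint multimodal representation

# Gating network: a probability distribution over the K experts
W_gate = rng.standard_normal((K, H)) * 0.1
gate = softmax(W_gate @ h)

# Each expert parameterizes a Gaussian over the next value
W_mu  = rng.standard_normal((K, H)) * 0.1
W_sig = rng.standard_normal((K, H)) * 0.1
mu    = W_mu @ h
sigma = np.log1p(np.exp(W_sig @ h))   # softplus keeps scales strictly positive

# Point forecast: gate-weighted mixture mean
mixture_mean = float(gate @ mu)

# Probabilistic forecast: sample an expert, then sample from its Gaussian
k = rng.choice(K, p=gate)
sample = rng.normal(mu[k], sigma[k])
```

In an actual system, `h` would come from the frozen LLM's decoding step and the expert/gating weights would be the learnable components; the sketch only shows how gating turns several simple distributions into one context-dependent predictive distribution.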

@article{yao2025_2505.10774,
  title={Context-Aware Probabilistic Modeling with LLM for Multimodal Time Series Forecasting},
  author={Yueyang Yao and Jiajun Li and Xingyuan Dai and MengMeng Zhang and Xiaoyan Gong and Fei-Yue Wang and Yisheng Lv},
  journal={arXiv preprint arXiv:2505.10774},
  year={2025}
}