
IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting

Abstract

Due to the non-stationarity of time series, the distribution shift problem largely hinders the performance of time series forecasting. Existing solutions either rely on certain statistics to specify the shift or develop specific mechanisms for particular network architectures. However, the former fails for shifts beyond simple statistics, while the latter has limited compatibility with different forecasting models. To overcome these problems, we first propose a decoupled formulation for time series forecasting that relies on no fixed statistics and places no restriction on the forecasting architecture. This formulation regards the shift-removal procedure as a special transformation between a raw distribution and a desired target distribution, and separates it from forecasting. We further formalize this formulation as a bi-level optimization problem, enabling the joint learning of the transformation (outer loop) and the forecasting (inner loop). Moreover, the transformation's special requirements of expressiveness and bi-direction motivate us to propose instance normalization flow (IN-Flow), a novel invertible network for time series transformation. Unlike classic "normalizing flow" models, IN-Flow does not aim to normalize the input to a prior distribution (e.g., a Gaussian) for generation; instead, it transforms the time series distribution by stacking normalization layers and flow-based invertible networks, hence the name "normalization" flow. Finally, we conduct extensive experiments on both synthetic and real-world data, which demonstrate the superiority of our method.
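The abstract's core requirement, a bi-directional (invertible) transformation that removes instance-level shift on the way in and restores it on the way out, can be illustrated with a minimal sketch. This toy uses fixed per-window mean/std statistics rather than the learnable normalization layers and flow blocks that IN-Flow stacks, so it shows the invertibility contract only, not the paper's architecture; the function names are illustrative, not from the paper.

```python
import math

def instance_norm_forward(x, eps=1e-8):
    """Remove per-window mean and scale (a toy stand-in for shift removal).

    Returns the normalized window plus the statistics needed to invert.
    """
    mu = sum(x) / len(x)
    var = sum((v - mu) ** 2 for v in x) / len(x)
    sigma = math.sqrt(var) + eps
    z = [(v - mu) / sigma for v in x]
    return z, (mu, sigma)

def instance_norm_inverse(z, stats):
    """Exactly restore the original window from the normalized one."""
    mu, sigma = stats
    return [v * sigma + mu for v in z]

# A forecasting model would operate on z (the shift-removed series),
# and its outputs would be mapped back through the inverse.
window = [5.2, 6.1, 4.8, 7.3, 5.9]
z, stats = instance_norm_forward(window)
restored = instance_norm_inverse(z, stats)
```

In the paper's decoupled formulation, a learned transformation plays the role of `instance_norm_forward`/`instance_norm_inverse`: the forecaster is trained on the transformed series in the inner loop, while the transformation itself is optimized in the outer loop.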

@article{fan2025_2401.16777,
  title={IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting},
  author={Wei Fan and Shun Zheng and Pengyang Wang and Rui Xie and Kun Yi and Qi Zhang and Jiang Bian and Yanjie Fu},
  journal={arXiv preprint arXiv:2401.16777},
  year={2025}
}