Wavelet-based Disentangled Adaptive Normalization for Non-stationary Times Series Forecasting

6 June 2025
Junpeng Lin
Tian-Shing Lan
Bo Zhang
Ke Lin
Dandan Miao
Huiru He
Jiantao Ye
Chen Zhang
Yan-fu Li
    AI4TS
Main: 11 pages · Appendix: 2 pages · Bibliography: 1 page · 4 figures · 7 tables
Abstract

Forecasting non-stationary time series is challenging because their statistical properties change over time, making it hard for deep models to generalize. Instance-level normalization techniques can help address shifts in temporal distribution. However, most existing methods overlook the multi-component nature of time series, in which different components exhibit distinct non-stationary behaviors. In this paper, we propose Wavelet-based Disentangled Adaptive Normalization (WDAN), a model-agnostic framework designed to address non-stationarity in time series forecasting. WDAN uses discrete wavelet transforms to decompose the input into low-frequency trends and high-frequency fluctuations, then applies tailored normalization strategies to each component. For trend components that exhibit strong non-stationarity, we apply first-order differencing to extract stable features used for predicting normalization parameters. Extensive experiments on multiple benchmarks demonstrate that WDAN consistently improves forecasting accuracy across various backbone models. Code is available at this https URL.
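The pipeline the abstract describes — wavelet decomposition into trend and fluctuation components, first-order differencing of the trend, and separate normalization statistics per component — can be sketched as follows. This is a minimal illustration with a one-level Haar transform and plain z-score normalization, not the paper's implementation; all function names and the choice of Haar wavelet are assumptions for exposition.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: split a 1-D series into low-frequency
    (approximation / trend) and high-frequency (detail / fluctuation)
    coefficients, each half the input length."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                          # pad to even length
        x = np.append(x, x[-1])
    trend  = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return trend, detail

def disentangled_normalize(x, eps=1e-8):
    """Decompose the input, then normalize each component separately.
    The strongly non-stationary trend is first-order differenced to
    obtain a more stable sequence before its statistics are estimated;
    the detail component is normalized directly."""
    trend, detail = haar_dwt(x)
    trend_diff = np.diff(trend)             # stabilize the trend
    mu_t, sig_t = trend_diff.mean(), trend_diff.std() + eps
    mu_d, sig_d = detail.mean(), detail.std() + eps
    trend_norm  = (trend_diff - mu_t) / sig_t
    detail_norm = (detail - mu_d) / sig_d
    # the statistics are returned so a forecaster's outputs
    # can later be de-normalized component by component
    return trend_norm, detail_norm, (mu_t, sig_t, mu_d, sig_d)
```

In WDAN itself the normalization parameters are predicted by a learned module and the framework wraps an arbitrary forecasting backbone; the sketch above only shows the decompose-then-normalize data path.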

@article{lin2025_2506.05857,
  title={Wavelet-based Disentangled Adaptive Normalization for Non-stationary Times Series Forecasting},
  author={Junpeng Lin and Tian Lan and Bo Zhang and Ke Lin and Dandan Miao and Huiru He and Jiantao Ye and Chen Zhang and Yan-fu Li},
  journal={arXiv preprint arXiv:2506.05857},
  year={2025}
}