
FRWKV: Frequency-Domain Linear Attention for Long-Term Time Series Forecasting

Qingyuan Yang
Shizhuo
Dongyue Chen
Da Teng
Zehua Gan
Main: 4 pages · Bibliography: 2 pages · 1 figure · 4 tables
Abstract

Traditional Transformers face a major bottleneck in long-sequence time series forecasting due to their quadratic complexity \(\mathcal{O}(T^2)\) and their limited ability to exploit frequency-domain information. Inspired by RWKV's \(\mathcal{O}(T)\) linear attention and by frequency-domain modeling, we propose FRWKV, a frequency-domain linear-attention framework that overcomes these limitations. Our model integrates linear attention with frequency-domain analysis, achieving \(\mathcal{O}(T)\) computational complexity in the attention path while exploiting spectral information to enrich temporal feature representations for scalable long-sequence modeling. Across eight real-world datasets, FRWKV achieves the best average rank. Ablation studies confirm the critical roles of both the linear-attention and frequency-encoder components. This work demonstrates the synergy between linear attention and frequency analysis, establishing a new paradigm for scalable time series modeling. Code is available at this repository: this https URL.
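To make the two ingredients concrete, here is a minimal, illustrative sketch of (a) an RWKV-style linear attention recurrence that runs in \(\mathcal{O}(T)\) over the sequence length, and (b) a toy frequency encoder that extracts spectral features with a real FFT. This is an assumption-laden sketch of the general techniques named in the abstract, not the authors' FRWKV implementation; the function names, decay parameterization, and feature layout are invented for illustration.

```python
import numpy as np

def linear_attention(k, v, w):
    """RWKV-style linear attention: a single O(T) pass over time.

    k, v: (T, D) key/value sequences; w: (D,) per-channel decay rate.
    Illustrative only -- not the paper's actual formulation.
    """
    T, D = k.shape
    num = np.zeros(D)            # running decayed sum of exp(k) * v
    den = np.zeros(D)            # running decayed normalizer exp(k)
    out = np.empty((T, D))
    decay = np.exp(-w)           # per-channel exponential forgetting
    for t in range(T):           # one step per time point => O(T)
        num = decay * num + np.exp(k[t]) * v[t]
        den = decay * den + np.exp(k[t])
        out[t] = num / (den + 1e-8)
    return out

def frequency_encoder(x):
    """Toy spectral features: real FFT along time, split into Re/Im parts."""
    spec = np.fft.rfft(x, axis=0)                    # (T//2+1, D) complex
    return np.concatenate([spec.real, spec.imag], axis=-1)

rng = np.random.default_rng(0)
T, D = 16, 4
x = rng.standard_normal((T, D))
attn_out = linear_attention(x, x, w=np.full(D, 0.5))
freq_feats = frequency_encoder(x)
print(attn_out.shape, freq_feats.shape)  # (16, 4) (9, 8)
```

The key property the abstract relies on is visible in the loop: each time step updates a fixed-size state, so cost grows linearly in \(T\), unlike the \(\mathcal{O}(T^2)\) pairwise scores of softmax attention.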
