Lightweight Transformer via Unrolling of Mixed Graph Algorithms for Traffic Forecast

To forecast traffic with both spatial and temporal dimensions, we unroll a mixed-graph-based optimization algorithm into a lightweight and interpretable transformer-like neural net. Specifically, we construct two graphs: an undirected graph $\mathcal{G}^u$ capturing spatial correlations across geography, and a directed graph $\mathcal{G}^d$ capturing sequential relationships over time. We formulate a prediction problem for the future samples of a signal $\mathbf{x}$, assuming it is "smooth" with respect to both $\mathcal{G}^u$ and $\mathcal{G}^d$, where we design new $\ell_2$- and $\ell_1$-norm variational terms to quantify and promote signal smoothness (low-frequency reconstruction) on a directed graph. We construct an iterative algorithm based on the alternating direction method of multipliers (ADMM), and unroll it into a feed-forward network for data-driven parameter learning. We insert graph learning modules for $\mathcal{G}^u$ and $\mathcal{G}^d$, which are akin to the self-attention mechanism in classical transformers. Experiments show that our unrolled networks achieve traffic forecast performance competitive with state-of-the-art prediction schemes, while drastically reducing parameter counts. Our code is available at this https URL.
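To illustrate the unrolling idea, the minimal PyTorch sketch below pairs a self-attention-like graph learning module (which learns an undirected adjacency from node features) with a stack of unrolled iterations using per-iteration learnable step sizes. This is a conceptual sketch only: the names (GraphLearnBlock, UnrolledSmoothingNet, feat_dim, num_unrolls, mu, gamma) are illustrative assumptions, and the update is a simple gradient-style graph-Laplacian smoothing step rather than the paper's ADMM iterations or its directed-graph variational terms.

```python
# Minimal sketch (not the authors' implementation): one learned-graph module
# plus a few unrolled smoothing iterations with learnable step sizes.
import torch
import torch.nn as nn


class GraphLearnBlock(nn.Module):
    """Learn a symmetric adjacency from node features (self-attention-like)."""

    def __init__(self, feat_dim: int, embed_dim: int = 16):
        super().__init__()
        self.query = nn.Linear(feat_dim, embed_dim, bias=False)
        self.key = nn.Linear(feat_dim, embed_dim, bias=False)

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_nodes, feat_dim)
        q, k = self.query(node_feats), self.key(node_feats)
        scores = q @ k.T / q.shape[-1] ** 0.5   # (N, N) pairwise similarities
        adj = torch.softmax(scores, dim=-1)     # row-stochastic edge weights
        return 0.5 * (adj + adj.T)              # symmetrize -> undirected graph


class UnrolledSmoothingNet(nn.Module):
    """Unrolled iterations x <- x - mu * L x + gamma * (y - x): a gradient step
    on a graph-Laplacian smoothness term plus a data-fidelity pull toward y."""

    def __init__(self, feat_dim: int, num_unrolls: int = 4):
        super().__init__()
        self.graph_learner = GraphLearnBlock(feat_dim)
        # One learnable step size / fidelity weight per unrolled iteration.
        self.mu = nn.Parameter(torch.full((num_unrolls,), 0.1))
        self.gamma = nn.Parameter(torch.full((num_unrolls,), 0.5))
        self.num_unrolls = num_unrolls

    def forward(self, y: torch.Tensor, node_feats: torch.Tensor) -> torch.Tensor:
        # y: (num_nodes,) observed signal; node_feats: (num_nodes, feat_dim)
        adj = self.graph_learner(node_feats)
        lap = torch.diag(adj.sum(dim=1)) - adj   # combinatorial graph Laplacian
        x = y.clone()
        for t in range(self.num_unrolls):
            x = x - self.mu[t] * (lap @ x) + self.gamma[t] * (y - x)
        return x


# Toy usage: 5 sensors, each with a 3-dimensional feature and a noisy reading.
net = UnrolledSmoothingNet(feat_dim=3)
x_hat = net(torch.randn(5), torch.randn(5, 3))
print(x_hat.shape)  # torch.Size([5])
```

Because the step sizes and the graph learner are trained end to end on data, the unrolled network keeps the parameter count small while retaining the interpretability of the underlying iterative algorithm.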
@article{qi2025_2505.13102,
  title   = {Lightweight Transformer via Unrolling of Mixed Graph Algorithms for Traffic Forecast},
  author  = {Ji Qi and Tam Thuc Do and Mingxiao Liu and Zhuoshi Pan and Yuzhe Li and Gene Cheung and H. Vicky Zhao},
  journal = {arXiv preprint arXiv:2505.13102},
  year    = {2025}
}