Lightweight Transformer via Unrolling of Mixed Graph Algorithms for Traffic Forecast

Abstract

To forecast traffic with both spatial and temporal dimensions, we unroll a mixed-graph-based optimization algorithm into a lightweight and interpretable transformer-like neural net. Specifically, we construct two graphs: an undirected graph $\mathcal{G}^u$ capturing spatial correlations across geography, and a directed graph $\mathcal{G}^d$ capturing sequential relationships over time. We formulate a prediction problem for the future samples of signal $\mathbf{x}$, assuming it is "smooth" with respect to both $\mathcal{G}^u$ and $\mathcal{G}^d$, where we design new $\ell_2$- and $\ell_1$-norm variational terms to quantify and promote signal smoothness (low-frequency reconstruction) on a directed graph. We construct an iterative algorithm based on the alternating direction method of multipliers (ADMM), and unroll it into a feed-forward network for data-driven parameter learning. We insert graph learning modules for $\mathcal{G}^u$ and $\mathcal{G}^d$, which are akin to the self-attention mechanism in classical transformers. Experiments show that our unrolled networks achieve traffic forecast performance competitive with state-of-the-art prediction schemes, while drastically reducing parameter counts. Our code is available at this https URL.
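The abstract's $\ell_2$ smoothness prior on the undirected graph $\mathcal{G}^u$ is, in the standard formulation, the graph Laplacian regularizer $\mathbf{x}^\top \mathbf{L} \mathbf{x}$, which is small when connected nodes have similar signal values. The sketch below illustrates this on a toy graph; it is an assumption-laden illustration of the generic prior only, not the paper's new directed-graph variational terms.

```python
import numpy as np

# Toy undirected graph G^u on 4 nodes (symmetric adjacency, a 4-cycle).
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W  # combinatorial graph Laplacian L = D - W

x_smooth = np.ones(4)                    # constant signal: maximally smooth
x_rough  = np.array([1., -1., 1., -1.])  # alternating signal: high-frequency

# l2 smoothness prior x^T L x: equals the sum of squared differences
# across edges, so it is 0 for the constant signal and large otherwise.
s_smooth = x_smooth @ L @ x_smooth   # 0.0
s_rough  = x_rough @ L @ x_rough     # 16.0
print(s_smooth, s_rough)
```

In the paper's setting, such a term is one piece of the objective that the unrolled ADMM iterations minimize; the directed-graph counterparts on $\mathcal{G}^d$ are the authors' new designs.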

@article{qi2025_2505.13102,
  title={Lightweight Transformer via Unrolling of Mixed Graph Algorithms for Traffic Forecast},
  author={Ji Qi and Tam Thuc Do and Mingxiao Liu and Zhuoshi Pan and Yuzhe Li and Gene Cheung and H. Vicky Zhao},
  journal={arXiv preprint arXiv:2505.13102},
  year={2025}
}