ResearchTrend.AI

Reducing Smoothness with Expressive Memory Enhanced Hierarchical Graph Neural Networks

1 April 2025
Thomas Bailie
Yun Sing Koh
S. Karthik Mukkavilli
V. Vetrova
    AI4TS
Abstract

Graphical forecasting models learn the structure of time series data by projecting it onto a graph, with recent techniques capturing spatial-temporal associations between variables via edge weights. Hierarchical variants offer a distinct advantage by analysing the time series across multiple resolutions, making them particularly effective in tasks like global weather forecasting, where low-resolution variable interactions are significant. A critical challenge in hierarchical models is information loss during forward or backward passes through the hierarchy. We propose the Hierarchical Graph Flow (HiGFlow) network, which introduces a memory buffer variable of dynamic size to store previously seen information across variable resolutions. We theoretically show two key results: HiGFlow reduces smoothness when mapping onto new feature spaces in the hierarchy and non-strictly enhances the utility of message passing by improving Weisfeiler-Lehman (WL) expressivity. Empirical results demonstrate that HiGFlow outperforms state-of-the-art baselines, including transformer models, by an average of at least 6.1% in MAE and 6.2% in RMSE. Code is available at this https URL.
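To make the abstract's mechanism concrete, here is a minimal pure-Python sketch of the general idea — mean-aggregation message passing on a two-level hierarchy, with a memory buffer that retains fine-level information when coarsening. This is not the authors' implementation: the graph, features, clustering, and update rules are all hypothetical placeholders standing in for HiGFlow's learned components.

```python
def message_pass(features, edges):
    """One round of mean-aggregation message passing.

    features: {node: [float]} node feature vectors
    edges:    {node: [neighbours]} adjacency lists
    """
    out = {}
    for v, x in features.items():
        nbrs = edges.get(v, [])
        msgs = [features[u] for u in nbrs] or [x]
        agg = [sum(col) / len(msgs) for col in zip(*msgs)]
        # blend self feature with the aggregated neighbour message;
        # repeated rounds of this smooth features toward each other
        out[v] = [0.5 * a + 0.5 * b for a, b in zip(x, agg)]
    return out


def coarsen_with_memory(fine_feats, clusters, memory):
    """Pool fine-level nodes into coarse nodes, appending the pooled
    features to a memory buffer so lower-resolution information is not
    discarded when moving up the hierarchy."""
    coarse = {}
    for c, members in clusters.items():
        pooled = [sum(col) / len(members)
                  for col in zip(*(fine_feats[v] for v in members))]
        coarse[c] = pooled
        memory.setdefault(c, []).extend(pooled)  # buffer grows dynamically
    return coarse


# toy fine-level graph: 4 nodes with 1-D features, two connected pairs
fine = {0: [1.0], 1: [3.0], 2: [5.0], 3: [7.0]}
edges = {0: [1], 1: [0], 2: [3], 3: [2]}

fine = message_pass(fine, edges)  # each pair's features move toward their mean
memory = {}
coarse = coarsen_with_memory(fine, {"A": [0, 1], "B": [2, 3]}, memory)
# coarse → {"A": [2.0], "B": [6.0]}; memory retains the same values for
# later levels to read back, which is the role the paper's buffer plays.
```

The sketch only illustrates the bookkeeping: in the actual model the aggregation, pooling, and buffer updates are learned, and the theoretical claims (reduced smoothness, improved WL expressivity) concern those learned mappings.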

@article{bailie2025_2504.00349,
  title={Reducing Smoothness with Expressive Memory Enhanced Hierarchical Graph Neural Networks},
  author={Thomas Bailie and Yun Sing Koh and S. Karthik Mukkavilli and Varvara Vetrova},
  journal={arXiv preprint arXiv:2504.00349},
  year={2025}
}