Hierarchical Implicit Neural Emulators

5 June 2025
Ruoxi Jiang, Xiao Zhang, Karan Jakhar, Peter Y. Lu, Pedram Hassanzadeh, Michael Maire, Rebecca Willett
Abstract

Neural PDE solvers offer a powerful tool for modeling complex dynamical systems, but they often struggle with error accumulation over long time horizons and with maintaining stability and physical consistency. We introduce a multiscale implicit neural emulator that enhances long-term prediction accuracy by conditioning on a hierarchy of lower-dimensional future state representations. Drawing inspiration from the stability properties of numerical implicit time-stepping methods, our approach leverages predictions several steps ahead in time, at increasing compression rates, to refine next-timestep predictions. By actively adjusting the temporal downsampling ratios, our design enables the model to capture dynamics across multiple granularities and enforce long-range temporal coherence. Experiments on turbulent fluid dynamics show that our method achieves high short-term accuracy and produces long-term stable forecasts, significantly outperforming autoregressive baselines while adding minimal computational overhead.
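
The abstract describes the core mechanism at a high level: the next-timestep prediction is refined by conditioning on forecasts made further ahead in time at increasingly coarse (compressed) representations. The sketch below is a hypothetical, minimal illustration of that conditioning pattern in PyTorch. The module names (CoarseForecaster, HierarchicalEmulator), layer sizes, downsampling factors, and the concatenation-based conditioning are assumptions made for illustration only, not the authors' architecture or training setup.

# Minimal, hypothetical sketch of conditioning next-step refinement on a
# hierarchy of coarse future forecasts. All names and sizes are illustrative
# assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoarseForecaster(nn.Module):
    """Predicts a spatially compressed state; conceptually, the more the state
    is compressed, the further ahead in time this forecaster looks."""
    def __init__(self, channels, downsample):
        super().__init__()
        self.downsample = downsample
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.GELU(),
            nn.Conv2d(64, channels, 3, padding=1),
        )

    def forward(self, u):
        # Compress spatially, then predict a coarse future state.
        u_coarse = F.avg_pool2d(u, self.downsample)
        return self.net(u_coarse)

class HierarchicalEmulator(nn.Module):
    """Next-step emulator conditioned on a hierarchy of coarse future forecasts."""
    def __init__(self, channels, downsample_factors=(2, 4, 8)):
        super().__init__()
        self.forecasters = nn.ModuleList(
            CoarseForecaster(channels, d) for d in downsample_factors
        )
        in_ch = channels * (1 + len(downsample_factors))
        self.refiner = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.GELU(),
            nn.Conv2d(64, channels, 3, padding=1),
        )

    def forward(self, u):
        h, w = u.shape[-2:]
        # Upsample each coarse future forecast back to full resolution and
        # concatenate it with the current state as conditioning information.
        futures = [
            F.interpolate(f(u), size=(h, w), mode="bilinear", align_corners=False)
            for f in self.forecasters
        ]
        x = torch.cat([u] + futures, dim=1)
        # Residual refinement of the next state.
        return u + self.refiner(x)

if __name__ == "__main__":
    model = HierarchicalEmulator(channels=2)
    u0 = torch.randn(1, 2, 64, 64)   # e.g. a 2D velocity field
    u1 = model(u0)                   # one emulated timestep
    print(u1.shape)                  # torch.Size([1, 2, 64, 64])

The design choice sketched here, feeding upsampled coarse forecasts back into the fine-scale update, is one plausible reading of "conditioning on a hierarchy of lower-dimensional future state representations"; the paper itself should be consulted for the actual architecture and training procedure.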

@article{jiang2025_2506.04528,
  title={Hierarchical Implicit Neural Emulators},
  author={Ruoxi Jiang and Xiao Zhang and Karan Jakhar and Peter Y. Lu and Pedram Hassanzadeh and Michael Maire and Rebecca Willett},
  journal={arXiv preprint arXiv:2506.04528},
  year={2025}
}