Hierarchical-embedding autoencoder with a predictor (HEAP) as efficient architecture for learning long-term evolution of complex multi-scale physical systems

24 May 2025
Alexander Khrabry
Edward Startsev
Andrew Powis
Igor Kaganovich
Main: 8 pages · 14 figures · Bibliography: 6 pages · Appendix: 6 pages
Abstract

We propose a novel, efficient architecture for learning the long-term evolution of complex multi-scale physical systems, based on the idea of separation of scales. Structures of various scales that dynamically emerge in the system interact with each other only locally. Structures of similar scale can interact directly when they are in contact, and indirectly when they are parts of larger structures that interact directly. This enables efficient modeling of a multi-scale system: interactions between small-scale features that are far apart from each other need not be modeled. The hierarchical fully-convolutional autoencoder transforms the state of a physical system not into a single embedding layer, as is done conventionally, but into a series of embedding layers that encode structures of various scales while preserving spatial information at the corresponding resolution level. Shallower layers embed smaller structures on a finer grid, while deeper layers embed larger structures on a coarser grid. The predictor advances all embedding layers in sync. Interactions between features of various scales are modeled using a combination of convolutional operators. We compare the performance of our model to variations of a conventional ResNet architecture applied to Hasegawa-Wakatani turbulence. A multifold improvement in long-term prediction accuracy was observed for crucial statistical characteristics of this system.
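The separation-of-scales idea above can be illustrated with a toy NumPy sketch: the encoder keeps one spatial embedding per scale (fine grids for small structures, coarse grids for large ones), and a predictor advances every level in sync using only local stencil operations. This is a minimal illustration of the concept, not the authors' implementation; the function names are hypothetical, and the real HEAP model uses learned convolutional encoders and predictors rather than fixed pooling and averaging stencils.

```python
import numpy as np

def avg_pool2x(x):
    # Downsample a 2D field by a factor of 2 via average pooling,
    # standing in for a strided convolutional encoder stage.
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def hierarchical_encode(field, levels=3):
    # Instead of one latent vector, keep one spatial embedding per scale:
    # shallow levels retain fine-grid detail, deeper levels retain
    # coarse-grid large structures.
    embeddings = []
    x = field
    for _ in range(levels):
        embeddings.append(x)   # embedding at this resolution level
        x = avg_pool2x(x)      # coarser grid for the next level
    return embeddings

def local_step(e):
    # Purely local 5-point stencil update (periodic boundaries); in the
    # real model this would be a learned convolutional operator.
    p = np.pad(e, 1, mode="wrap")
    return (p[:-2, 1:-1] + p[2:, 1:-1] +
            p[1:-1, :-2] + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0

def predictor_step(embeddings):
    # Advance all embedding layers in sync; each level only needs local
    # interactions, so distant small-scale features never couple directly.
    return [local_step(e) for e in embeddings]

field = np.random.default_rng(0).standard_normal((16, 16))
embs = predictor_step(hierarchical_encode(field, levels=3))
print([e.shape for e in embs])  # [(16, 16), (8, 8), (4, 4)]
```

The key efficiency point the sketch captures: each level's update touches only nearest-neighbor cells on its own grid, so long-range interactions between small structures are mediated by the coarse levels rather than modeled directly.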

@article{khrabry2025_2505.18857,
  title={Hierarchical-embedding autoencoder with a predictor (HEAP) as efficient architecture for learning long-term evolution of complex multi-scale physical systems},
  author={Alexander Khrabry and Edward Startsev and Andrew Powis and Igor Kaganovich},
  journal={arXiv preprint arXiv:2505.18857},
  year={2025}
}