Mesh-Informed Neural Operator: A Transformer Generative Approach

Generative models in function spaces, situated at the intersection of generative modeling and operator learning, are attracting increasing attention due to their immense potential in diverse scientific and engineering applications. While functional generative models are theoretically domain- and discretization-agnostic, current implementations heavily rely on the Fourier Neural Operator (FNO), limiting their applicability to regular grids and rectangular domains. To overcome these critical limitations, we introduce the Mesh-Informed Neural Operator (MINO). By leveraging graph neural operators and cross-attention mechanisms, MINO offers a principled, domain- and discretization-agnostic backbone for generative modeling in function spaces. This advancement significantly expands the scope of such models to more diverse applications in generative, inverse, and regression tasks. Furthermore, MINO provides a unified perspective on integrating neural operators with general advanced deep learning architectures. Finally, we introduce a suite of standardized evaluation metrics that enable objective comparison of functional generative models, addressing another critical gap in the field.
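The discretization-agnostic property described above hinges on cross-attention: a fixed set of latent tokens attends to features sampled at an arbitrary number of mesh points, so the output shape is independent of the input discretization. The following is a minimal illustrative sketch of that mechanism, not the authors' implementation; all names and dimensions are assumptions for illustration.

```python
# Minimal sketch (not the MINO code): cross-attention mapping function
# samples at arbitrarily many mesh points onto a fixed set of latent tokens.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """queries: (m, d) latent tokens; keys/values: (n, d) mesh-point features.
    n, the number of mesh points, may differ freely between calls."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)    # (m, n) attention logits
    return softmax(scores, axis=-1) @ values  # (m, d) aggregated output

rng = np.random.default_rng(0)
d, m = 16, 8                           # feature width, latent token count
latents = rng.normal(size=(m, d))      # fixed query tokens
for n in (100, 357):                   # two different irregular discretizations
    feats = rng.normal(size=(n, d))    # hypothetical encoded (point, value) features
    out = cross_attention(latents, feats, feats)
    print(out.shape)                   # (8, 16) regardless of n
```

Because the latent representation has a fixed size no matter how the input function is sampled, downstream transformer layers can operate on it without any assumption of a regular grid or rectangular domain.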
@article{shi2025_2506.16656,
  title={Mesh-Informed Neural Operator: A Transformer Generative Approach},
  author={Yaozhong Shi and Zachary E. Ross and Domniki Asimaki and Kamyar Azizzadenesheli},
  journal={arXiv preprint arXiv:2506.16656},
  year={2025}
}