
Symplectic Generative Networks (SGNs): A Hamiltonian Framework for Invertible Deep Generative Modeling

Main: 34 pages
6 figures
Bibliography: 3 pages
Abstract

We introduce the Symplectic Generative Network (SGN), a deep generative model that leverages Hamiltonian mechanics to construct an invertible, volume-preserving mapping between a latent space and the data space. By endowing the latent space with a symplectic structure and modeling data generation as the time evolution of a Hamiltonian system, SGN achieves exact likelihood evaluation without incurring the computational overhead of Jacobian determinant calculations. In this work, we provide a rigorous mathematical foundation for SGNs through a comprehensive theoretical framework that includes: (i) complete proofs of invertibility and volume preservation, (ii) a formal complexity analysis with theoretical comparisons to Variational Autoencoders and Normalizing Flows, (iii) strengthened universal approximation results with quantitative error bounds, (iv) an information-theoretic analysis based on the geometry of statistical manifolds, and (v) an extensive stability analysis with adaptive integration guarantees. These contributions highlight the fundamental advantages of SGNs and establish a solid foundation for future empirical investigations and applications to complex, high-dimensional data.
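The exact-likelihood claim rests on the fact that a symplectic integrator, such as a leapfrog (Störmer–Verlet) step, is exactly invertible and has unit Jacobian determinant, so no log-determinant correction is needed in the change of variables. The following is a minimal illustrative sketch of that property, not the authors' implementation: the separable Hamiltonian, the toy quadratic potential, and all function names (leapfrog, grad_U, grad_K) are assumptions chosen for demonstration.

```python
# Minimal sketch (not the paper's code): a leapfrog step is symplectic,
# hence invertible and volume-preserving, so |det J| = 1 and
# log p_x(x) = log p_z(z) with no Jacobian correction.
import numpy as np

def leapfrog(q, p, grad_U, grad_K, dt):
    """One leapfrog step for H(q, p) = U(q) + K(p); each sub-step is a shear."""
    p = p - 0.5 * dt * grad_U(q)   # half kick
    q = q + dt * grad_K(p)         # drift
    p = p - 0.5 * dt * grad_U(q)   # half kick
    return q, p

def leapfrog_inverse(q, p, grad_U, grad_K, dt):
    """Exact inverse of the step above: rerun it with a negated time step."""
    return leapfrog(q, p, grad_U, grad_K, -dt)

# Toy separable Hamiltonian: U(q) = 0.5*||q||^2, K(p) = 0.5*||p||^2.
grad_U = lambda q: q
grad_K = lambda p: p

rng = np.random.default_rng(0)
q0, p0 = rng.normal(size=4), rng.normal(size=4)

# Forward map (generation) and its exact inverse (inference) recover the input.
q1, p1 = leapfrog(q0, p0, grad_U, grad_K, dt=0.1)
q2, p2 = leapfrog_inverse(q1, p1, grad_U, grad_K, dt=0.1)
print(np.allclose(q0, q2), np.allclose(p0, p2))  # True True
```

In an SGN-style model the potential and kinetic terms would be learned networks rather than the fixed quadratics used here; the invertibility and volume-preservation arguments are unchanged because they depend only on the symplectic structure of the integrator.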

@article{aich2025_2505.22527,
  title={Symplectic Generative Networks (SGNs): A Hamiltonian Framework for Invertible Deep Generative Modeling},
  author={Agnideep Aich and Ashit Aich and Bruce Wade},
  journal={arXiv preprint arXiv:2505.22527},
  year={2025}
}