Energy-based generator matching: A neural sampler for general state space

Abstract
We propose Energy-based generator matching (EGM), a modality-agnostic approach to training generative models from energy functions in the absence of data. Extending the recently proposed generator matching framework, EGM enables training of arbitrary continuous-time Markov processes, e.g., diffusion, flow, and jump processes, and can generate data in continuous, discrete, and mixed continuous-discrete state spaces. To this end, we propose estimating the generator matching loss using self-normalized importance sampling, with an additional bootstrapping trick to reduce the variance of the importance weights. We validate EGM on discrete and multimodal tasks of up to 100 and 20 dimensions, respectively.
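
The estimator at the heart of the abstract is self-normalized importance sampling (SNIS), which approximates an expectation under a target known only through its energy, i.e., p(x) ∝ exp(-E(x)). As a minimal illustration of SNIS itself, not the paper's actual EGM loss or its bootstrapping trick, the following sketch is hypothetical: the function names (`snis_estimate`, `q_sample`, `q_logpdf`) and the toy Gaussian target are assumptions for demonstration.

```python
import numpy as np

def snis_estimate(energy, q_sample, q_logpdf, f, n=10_000, seed=0):
    """Estimate E_p[f(x)] for p(x) proportional to exp(-energy(x))
    via self-normalized importance sampling with proposal q."""
    rng = np.random.default_rng(seed)
    x = q_sample(rng, n)                 # draw x_i ~ q
    log_w = -energy(x) - q_logpdf(x)     # log of unnormalized weights
    log_w -= log_w.max()                 # subtract max for numerical stability
    w = np.exp(log_w)
    w /= w.sum()                         # self-normalization: weights sum to 1
    return w @ f(x)                      # weighted average approximates E_p[f]

# Toy check: target p(x) ~ N(0, 1) written as energy E(x) = x^2 / 2,
# proposal q = N(0, 2^2). E_p[x^2] should come out close to 1.
est = snis_estimate(
    energy=lambda x: 0.5 * x**2,
    q_sample=lambda rng, n: rng.normal(0.0, 2.0, size=n),
    q_logpdf=lambda x: -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2 * np.pi)),
    f=lambda x: x**2,
)
print(est)  # close to 1.0
```

Because the weights are normalized by their sum, the unknown normalizing constant of p cancels, which is what makes the estimator usable when only the energy function is available.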
@article{woo2025_2505.19646,
  title   = {Energy-based generator matching: A neural sampler for general state space},
  author  = {Dongyeop Woo and Minsu Kim and Minkyu Kim and Kiyoung Seong and Sungsoo Ahn},
  journal = {arXiv preprint arXiv:2505.19646},
  year    = {2025}
}