
Scalable Expectation Estimation with Subtractive Mixture Models

Main: 8 pages · 6 figures · 3 tables · Bibliography: 3 pages · Appendix: 6 pages
Abstract

Many Monte Carlo (MC) and importance sampling (IS) methods use mixture models (MMs) for their simplicity and their ability to capture multimodal distributions. Recently, subtractive mixture models (SMMs), i.e., MMs with negative mixture coefficients, have shown greater expressiveness and success in generative modeling. However, their negative parameters complicate sampling, requiring costly auto-regressive techniques or accept-reject algorithms that do not scale to high dimensions. In this work, we use the difference representation of SMMs to construct an unbiased IS estimator (ΔEx) that removes the need to sample from the SMM, enabling high-dimensional expectation estimation with SMMs. In our experiments, we show that ΔEx can achieve estimation quality comparable to auto-regressive sampling while being considerably faster in MC estimation. Moreover, we conduct initial experiments with ΔEx using hand-crafted proposals, gaining first insights into how to construct safe proposals for ΔEx.
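To make the core idea concrete, below is a minimal, hypothetical sketch (not the paper's code) of how importance sampling sidesteps sampling from a subtractive mixture: the SMM density is evaluated, but samples are drawn only from an ordinary all-positive Gaussian mixture proposal, and the resulting (possibly signed) importance weights yield an unbiased estimate of the expectation. All component parameters, the proposal, and the test function are made-up assumptions for illustration, and this sketches only the general signed-weight IS idea, not the exact ΔEx construction from the paper.

# Hypothetical sketch: estimate E_p[f] for a 1-D subtractive mixture p
# via importance sampling, without ever sampling from p itself.
import numpy as np
from scipy.stats import norm

# Subtractive mixture: p(x) = (0.7*N(x;-1,1) + 0.7*N(x;1,1) - 0.4*N(x;0,0.5)) / Z,
# with one negative coefficient; Z normalizes the signed mixture.
weights = np.array([0.7, 0.7, -0.4])              # signed mixture coefficients (illustrative)
means, scales = np.array([-1.0, 1.0, 0.0]), np.array([1.0, 1.0, 0.5])
Z = weights.sum()                                  # each Gaussian component integrates to 1

def smm_density(x):
    comps = norm.pdf(x[:, None], loc=means, scale=scales)
    return comps @ weights / Z

def f(x):                                          # test function whose expectation we want
    return x ** 2

# Proposal q: a plain (all-positive) Gaussian mixture covering the SMM's support.
rng = np.random.default_rng(0)
n = 100_000
idx = rng.integers(0, 2, size=n)
samples = rng.normal(loc=np.array([-1.0, 1.0])[idx], scale=1.5)
q_density = 0.5 * (norm.pdf(samples, -1.0, 1.5) + norm.pdf(samples, 1.0, 1.5))

# Importance weights p(x)/q(x): they can be small or negative where the
# subtracted component dominates, but the estimator remains unbiased.
iw = smm_density(samples) / q_density
estimate = np.mean(f(samples) * iw)
print(f"IS estimate of E_p[x^2]: {estimate:.4f}")

As in the abstract, the key point is that only the proposal is ever sampled; the SMM appears purely through density evaluations inside the importance weights, so no auto-regressive or accept-reject sampling of the subtractive mixture is needed.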
