  3. 1611.06874
Langevin Incremental Mixture Importance Sampling

21 November 2016
Matteo Fasiolo
Flávio Eler De Melo
Simon Maskell
Abstract

This work proposes a novel method through which local information about the target density can be used to construct an efficient importance sampler. The backbone of the proposed method is the Incremental Mixture Importance Sampling (IMIS) algorithm of Raftery and Bao (2010), which builds a mixture importance distribution incrementally, by positioning new mixture components where the importance density lacks mass relative to the target. The key innovation proposed here is that the mixture components used by IMIS are local approximations to the target density. In particular, their mean vectors and covariance matrices are constructed by numerically solving certain differential equations, whose solution depends on the gradient field of the target log-density. The new sampler has a number of advantages: a) it provides an extremely parsimonious parametrization of the mixture importance density, whose configuration effectively depends only on the shape of the target and on a single free parameter representing pseudo-time; b) it scales well with the dimensionality of the target; c) it can deal with targets that are not log-concave. The performance of the proposed approach is demonstrated on a synthetic non-Gaussian multimodal density, defined on up to eighty dimensions, and on a Bayesian logistic regression model, using the Sonar dataset. The Julia code implementing the importance sampler proposed here can be found at https://github.com/mfasiolo/LIMIS.
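To make the idea concrete, the sketch below shows one plausible way a local Gaussian mixture component could be evolved along the deterministic Langevin drift, with the mean following the gradient of the log-target and the covariance following the linearized moment equation of the Langevin diffusion. These particular ODEs are an assumption for illustration (they are the moment equations of the linearized diffusion dX = ∇log π(X) dt + √2 dW), not necessarily the exact equations used in the paper; the function names and the `T` (pseudo-time) and `dt` parameters are likewise hypothetical.

```python
import numpy as np

def gaussian_flow(grad_logp, hess_logp, mu0, Sigma0, T=5.0, dt=0.01):
    """Euler-integrate the mean/covariance of a local Gaussian approximation
    driven by the deterministic Langevin drift (illustrative sketch):
        d mu / dt    = grad log pi(mu)
        d Sigma / dt = H Sigma + Sigma H^T + 2 I,   H = Hessian of log pi at mu.
    T plays the role of the single pseudo-time parameter mentioned above."""
    mu, Sigma = mu0.astype(float).copy(), Sigma0.astype(float).copy()
    d = mu.size
    for _ in range(int(T / dt)):
        H = hess_logp(mu)
        mu = mu + dt * grad_logp(mu)                          # drift step
        Sigma = Sigma + dt * (H @ Sigma + Sigma @ H.T + 2.0 * np.eye(d))
    return mu, Sigma

# Demo on a standard-normal target: grad log pi(x) = -x, Hessian = -I.
# The mean flows toward the mode and the covariance toward the identity,
# i.e. the component adapts to the local shape of the target.
d = 3
mu, Sigma = gaussian_flow(lambda x: -x, lambda x: -np.eye(d),
                          mu0=np.full(d, 4.0), Sigma0=0.1 * np.eye(d))
```

In a sampler of this kind, each such (mu, Sigma) pair would define one Gaussian component of the incrementally built mixture importance density, so the mixture is parametrized by the target's own geometry plus the pseudo-time T.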
