Layered Adaptive Importance Sampling
May 18, 2015 · Declared Dead · Statistics and computing
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
L. Martino, V. Elvira, D. Luengo, J. Corander
arXiv ID
1505.04732
Category
stat.CO
Cross-listed
cs.LG, stat.ML
Citations
110
Venue
Statistics and computing
Last Checked
1 month ago
Abstract
Monte Carlo methods represent the "de facto" standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting estimators. In this work, we introduce a layered (i.e., hierarchical) procedure to generate samples employed within a Monte Carlo scheme. This approach ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of a catastrophic performance), although at the expense of a moderate increase in the complexity. Furthermore, we provide a general unified importance sampling (IS) framework, where multiple proposal densities are employed and several IS schemes are introduced by applying the so-called deterministic mixture approach. Finally, given these schemes, we also propose a novel class of adaptive importance samplers using a population of proposals, where the adaptation is driven by independent parallel or interacting Markov Chain Monte Carlo (MCMC) chains. The resulting algorithms efficiently combine the benefits of both IS and MCMC methods.
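No code was found for this paper, but the multiple-proposal importance sampling scheme with deterministic mixture weights that the abstract describes can be sketched as follows. This is an illustration only, not the authors' implementation: the bimodal target, the Gaussian proposal locations, and the sample counts are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target: a symmetric bimodal 1-D density (illustrative choice).
def target(x):
    return 0.5 * np.exp(-0.5 * (x - 3.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 3.0) ** 2)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Population of Gaussian proposals; in the adaptive schemes the paper describes,
# these locations would be updated by parallel or interacting MCMC chains.
mus = np.array([-4.0, 0.0, 4.0])
sigma = 1.5
M = 1000  # samples drawn from each proposal

samples, weights = [], []
for mu in mus:
    x = rng.normal(mu, sigma, size=M)
    # Deterministic mixture weights: the denominator is the full equally-weighted
    # mixture of all proposals, not just the proposal that generated x.
    mixture = np.mean([gauss_pdf(x, m, sigma) for m in mus], axis=0)
    samples.append(x)
    weights.append(target(x) / mixture)

samples = np.concatenate(samples)
weights = np.concatenate(weights)

# Self-normalized IS estimate of E[x] under the target (near 0 by symmetry).
est = np.sum(weights * samples) / np.sum(weights)
print(est)
```

Evaluating the weight denominator at the whole mixture rather than at the individual proposal is what gives the deterministic mixture approach its robustness: a sample that lands far from its own proposal but close to another one does not receive an exploding weight.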
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · stat.CO (all R.I.P. · 👻 Ghosted)
Edward: A library for probabilistic modeling, inference, and criticism
Coresets for Scalable Bayesian Logistic Regression
colorspace: A Toolbox for Manipulating and Assessing Colors and Palettes
Fast Discrete Distribution Clustering Using Wasserstein Barycenter with Sparse Support
Poisson multi-Bernoulli conjugate prior for multiple extended object filtering
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System