Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
March 07, 2017 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Thomas B. Schön, Andreas Svensson, Lawrence Murray, Fredrik Lindsten
arXiv ID
1703.02419
Category
stat.CO
Cross-listed
cs.LG, eess.SY
Citations
42
Venue
arXiv.org
Last Checked
1 month ago
Abstract
Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods---the particle Metropolis--Hastings algorithm---which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis--Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods---including particle Metropolis--Hastings---to a large group of users without requiring them to know all the underlying mathematical details.
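The abstract describes how the particle filter supplies the likelihood estimates that drive a Metropolis--Hastings sampler through the parameter space. As a rough illustration of that idea (not the paper's own example, model, or modeling language), here is a minimal particle Metropolis--Hastings sketch in Python/NumPy for an assumed toy scalar nonlinear state-space model, with a flat prior and a random-walk proposal:

```python
# Minimal sketch of particle Metropolis--Hastings (PMH).
# The toy model, prior, and tuning constants are illustrative assumptions,
# not the example used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, T=100):
    """Simulate data from a toy nonlinear state-space model:
    x_t = theta*x_{t-1} + cos(x_{t-1}) + v_t,  y_t = x_t + e_t."""
    x = np.zeros(T)
    x_prev = 0.0
    for t in range(T):
        x[t] = theta * x_prev + np.cos(x_prev) + 0.5 * rng.standard_normal()
        x_prev = x[t]
    return x + rng.standard_normal(T)

def log_likelihood_pf(theta, y, N=200):
    """Bootstrap particle filter; returns an unbiased estimate of log p(y | theta)."""
    particles = np.zeros(N)
    log_z = 0.0
    for t in range(len(y)):
        # Propagate particles through the state dynamics.
        particles = (theta * particles + np.cos(particles)
                     + 0.5 * rng.standard_normal(N))
        # Weight by the Gaussian measurement density p(y_t | x_t).
        log_w = -0.5 * (y[t] - particles) ** 2 - 0.5 * np.log(2 * np.pi)
        max_w = log_w.max()
        w = np.exp(log_w - max_w)
        log_z += max_w + np.log(w.mean())   # running log-likelihood estimate
        # Multinomial resampling.
        idx = rng.choice(N, size=N, p=w / w.sum())
        particles = particles[idx]
    return log_z

def pmh(y, n_iters=2000, step=0.1):
    """Random-walk particle Metropolis--Hastings over the scalar parameter theta."""
    theta = 0.0
    log_z = log_likelihood_pf(theta, y)
    chain = np.empty(n_iters)
    for i in range(n_iters):
        theta_prop = theta + step * rng.standard_normal()
        log_z_prop = log_likelihood_pf(theta_prop, y)
        # Flat prior and symmetric proposal: only the likelihood estimates remain.
        if np.log(rng.random()) < log_z_prop - log_z:
            theta, log_z = theta_prop, log_z_prop
        chain[i] = theta
    return chain

if __name__ == "__main__":
    y = simulate(theta=0.7)
    chain = pmh(y)
    print("posterior mean of theta:", chain[500:].mean())
```

Because the particle filter's likelihood estimate is unbiased, a chain of this form targets the exact parameter posterior even with a finite number of particles, which is the convergence property the abstract highlights; in practice the number of particles and the proposal step size are tuned jointly to keep the acceptance rate reasonable.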
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · stat.CO
👻 Ghosted · Edward: A library for probabilistic modeling, inference, and criticism
👻 Ghosted · Coresets for Scalable Bayesian Logistic Regression
👻 Ghosted · colorspace: A Toolbox for Manipulating and Assessing Colors and Palettes
👻 Ghosted · Fast Discrete Distribution Clustering Using Wasserstein Barycenter with Sparse Support
👻 Ghosted · Poisson multi-Bernoulli conjugate prior for multiple extended object filtering
Died the same way · 👻 Ghosted
👻 Ghosted · Language Models are Few-Shot Learners
👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
👻 Ghosted · XGBoost: A Scalable Tree Boosting System