Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices
March 20, 2019 · Declared Dead · Neural Information Processing Systems
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Santosh S. Vempala, Andre Wibisono
arXiv ID
1903.08568
Category
cs.DS: Data Structures & Algorithms
Cross-listed
cs.LG, math.PR, stat.ML
Citations
323
Venue
Neural Information Processing Systems
Last Checked
1 month ago
Abstract
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback-Leibler (KL) divergence assuming $\nu$ satisfies a log-Sobolev inequality and the Hessian of $f$ is bounded. Notably, we do not assume convexity or bounds on higher derivatives. We also prove convergence guarantees in Rényi divergence of order $q > 1$ assuming the limit of ULA satisfies either the log-Sobolev or Poincaré inequality. We also prove a bound on the bias of the limiting distribution of ULA assuming third-order smoothness of $f$, without requiring isoperimetry.
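The ULA iteration studied in the abstract is the Euler–Maruyama discretization of the Langevin diffusion for $\nu = e^{-f}$: take a gradient step on $f$ and add Gaussian noise. A minimal sketch (the step size, iteration count, and Gaussian target below are illustrative choices, not taken from the paper):

```python
import numpy as np

def ula_sample(grad_f, x0, step, n_iters, rng):
    """Run the Unadjusted Langevin Algorithm from x0 and return the final iterate.

    Update rule: x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * N(0, I).
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
    return x

# Illustration: target nu = e^{-f} with f(x) = ||x||^2 / 2 (a standard Gaussian),
# so grad_f(x) = x. ULA converges to a slightly biased stationary distribution;
# the bias shrinks with the step size.
rng = np.random.default_rng(0)
samples = np.array([
    ula_sample(lambda x: x, np.zeros(2), step=0.01, n_iters=2000, rng=rng)
    for _ in range(500)
])
```

With this small step size the empirical mean and variance of `samples` should be close to the target's (0 and 1 per coordinate), up to Monte Carlo error and the discretization bias the paper quantifies.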
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Data Structures & Algorithms
Relief-Based Feature Selection: Introduction and Review
Route Planning in Transportation Networks
Near-linear time approximation algorithms for optimal transport via Sinkhorn iteration
Hierarchical Clustering: Objective Functions and Algorithms
Graph Isomorphism in Quasipolynomial Time
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System