Rényi Divergence Variational Inference
February 06, 2016 · Declared Dead · 🏛 Neural Information Processing Systems
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Yingzhen Li, Richard E. Turner
arXiv ID: 1602.02311
Category: stat.ML (Machine Learning)
Cross-listed: cs.LG
Citations: 287
Venue: Neural Information Processing Systems
Last Checked: 1 month ago
Abstract
This paper introduces the variational Rényi bound (VR) that extends traditional variational inference to Rényi's alpha-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth interpolation from the evidence lower-bound to the log (marginal) likelihood that is controlled by the value of alpha that parametrises the divergence. The reparameterization trick, Monte Carlo approximation and stochastic optimisation methods are deployed to obtain a tractable and unified framework for optimisation. We further consider negative alpha values and propose a novel variational inference method as a new special case in the proposed framework. Experiments on Bayesian neural networks and variational auto-encoders demonstrate the wide applicability of the VR bound.
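For readers skimming the abstract, the bound being described is the Rényi generalisation of the evidence lower bound. In the paper's notation, the VR bound and its K-sample Monte Carlo approximation are

\[
\mathcal{L}_\alpha(q;\mathbf{x}) = \frac{1}{1-\alpha}\,\log \mathbb{E}_{q(\mathbf{z})}\!\left[\left(\frac{p(\mathbf{z},\mathbf{x})}{q(\mathbf{z})}\right)^{1-\alpha}\right],
\qquad
\hat{\mathcal{L}}_{\alpha,K} = \frac{1}{1-\alpha}\,\log\frac{1}{K}\sum_{k=1}^{K}\left(\frac{p(\mathbf{z}_k,\mathbf{x})}{q(\mathbf{z}_k)}\right)^{1-\alpha},
\quad \mathbf{z}_k\sim q(\mathbf{z}).
\]

Taking the limit α → 1 recovers the standard ELBO, while α = 0 under the finite-sample approximation coincides with the importance-weighted (IWAE) objective, which is the smooth interpolation the abstract refers to. Below is a minimal numerical sketch of the estimator on a toy Gaussian model; the toy model, the variable names, and the vr_bound_estimate helper are illustrative choices, not code from the paper.

```python
import numpy as np

def vr_bound_estimate(log_joint, log_q, z_samples, alpha):
    """Monte Carlo estimate of the variational Renyi bound
    L_alpha = 1/(1-alpha) * log E_q[(p(z,x)/q(z))^(1-alpha)]."""
    log_w = log_joint(z_samples) - log_q(z_samples)  # log importance weights, shape (K,)
    if np.isclose(alpha, 1.0):
        # The alpha -> 1 limit of the VR bound is the standard ELBO: E_q[log w]
        return np.mean(log_w)
    scaled = (1.0 - alpha) * log_w
    # log-mean-exp for numerical stability
    m = np.max(scaled)
    return (m + np.log(np.mean(np.exp(scaled - m)))) / (1.0 - alpha)

# Toy model (illustrative): p(z) = N(0, 1), p(x|z) = N(z, 1), q(z) = N(mu, sigma^2)
x = 1.5
mu, log_sigma = 0.8, np.log(0.7)
K = 1000
eps = np.random.randn(K)
z = mu + np.exp(log_sigma) * eps  # reparameterised samples z = mu + sigma * eps

log_joint = lambda z: -0.5 * (z**2 + (x - z)**2) - np.log(2 * np.pi)
log_q = lambda z: (-0.5 * ((z - mu) / np.exp(log_sigma))**2
                   - log_sigma - 0.5 * np.log(2 * np.pi))

for alpha in [-1.0, 0.0, 0.5, 1.0]:
    print(f"alpha = {alpha:+.1f}: VR bound estimate = "
          f"{vr_bound_estimate(log_joint, log_q, z, alpha):.4f}")
```

Because the log weights are exponentiated by (1 − α), the estimate is computed in log space with a log-mean-exp to avoid overflow, and α = 1 is handled as a separate case since the closed-form expression is indeterminate there. In an autodiff framework, gradients would flow through the reparameterised samples z = mu + sigma * eps, which is the role of the reparameterization trick mentioned in the abstract.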
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
📜 Similar Papers
In the same crypt — Machine Learning (Stat)
Distilling the Knowledge in a Neural Network (👻 Ghosted)
Layer Normalization (👻 Ghosted)
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning (👻 Ghosted)
Domain-Adversarial Training of Neural Networks (👻 Ghosted)
Deep Learning with Differential Privacy (👻 Ghosted)
Died the same way — 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System