Glow: Generative Flow with Invertible 1x1 Convolutions
July 09, 2018 · Entered Twilight · Neural Information Processing Systems
"Last commit was 5.0 years ago (β₯5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, LICENSE, README.md, data_loaders, demo, graphics.py, memory_saving_gradients.py, model.py, optim.py, requirements.txt, tfops.py, train.py, utils.py
Authors: Diederik P. Kingma, Prafulla Dhariwal
arXiv ID: 1807.03039
Category: stat.ML (Machine Learning)
Cross-listed: cs.AI, cs.LG
Citations: 3.5K
Venue: Neural Information Processing Systems
Repository: https://github.com/openai/glow (★ 3180)
Last Checked: 1 month ago
Abstract
Flow-based generative models (Dinh et al., 2014) are conceptually attractive due to tractability of the exact log-likelihood, tractability of exact latent-variable inference, and parallelizability of both training and synthesis. In this paper we propose Glow, a simple type of generative flow using an invertible 1x1 convolution. Using our method we demonstrate a significant improvement in log-likelihood on standard benchmarks. Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images. The code for our model is available at https://github.com/openai/glow
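Since the abstract's central ingredient is the invertible 1x1 convolution, here is a minimal NumPy sketch of such a layer: a per-pixel channel-mixing matrix W, its exact inverse, and the log-determinant term it contributes to the log-likelihood. This is an illustrative toy under stated assumptions, not the repository's TensorFlow implementation (see model.py and tfops.py); the function names are hypothetical.

import numpy as np

def invertible_1x1_conv(x, W):
    # A 1x1 convolution is a per-pixel linear map: each spatial
    # position's channel vector is multiplied by W (c x c).
    b, h, w, c = x.shape
    y = x @ W
    # Each of the h*w positions contributes log|det W| to the
    # log-likelihood (the change-of-variables term).
    log_det = h * w * np.linalg.slogdet(W)[1]
    return y, log_det

def inverse_1x1_conv(y, W):
    # Exact inverse: multiply each pixel's channels by W^{-1}.
    return y @ np.linalg.inv(W)

# Initialize W as a random rotation matrix, as the paper does,
# so the layer starts volume-preserving (log|det W| = 0).
rng = np.random.default_rng(0)
c = 8
W, _ = np.linalg.qr(rng.normal(size=(c, c)))
x = rng.normal(size=(2, 16, 16, c))
y, log_det = invertible_1x1_conv(x, W)
assert np.allclose(inverse_1x1_conv(y, W), x)  # exact invertibility
assert abs(log_det) < 1e-8                     # rotation: zero log-det

The paper further notes that parameterizing W via an LU decomposition reduces the cost of computing det(W) from O(c^3) to O(c).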
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Machine Learning (Stat)
Distilling the Knowledge in a Neural Network · R.I.P. 👻 Ghosted
Layer Normalization · R.I.P. 👻 Ghosted
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning · R.I.P. 👻 Ghosted
Domain-Adversarial Training of Neural Networks · R.I.P. 👻 Ghosted