Computing Entropies With Nested Sampling
July 12, 2017 · Entered Twilight · Entropy
"Last commit was 6.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitattributes, .gitignore, LICENSE, README.md, cpp, paper, talk
Authors
Brendon J. Brewer
arXiv ID
1707.03543
Category
stat.CO
Cross-listed
astro-ph.IM,
cs.IT,
physics.data-an
Citations
11
Venue
Entropy
Repository
https://github.com/eggplantbren/InfoNest
⭐ 7
Last Checked
1 month ago
Abstract
The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario.
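The paper's own implementation is the C++ code in the linked InfoNest repository. As an illustration only, here is a minimal Python sketch of the core tool the abstract refers to: basic nested sampling, used to estimate a log-evidence ln Z for a toy setup resembling example (i), a standard normal "likelihood" under a uniform prior on [-5, 5] (so Z ≈ 0.1). All function names and settings here are my own assumptions, not taken from the paper or the repository.

```python
import math
import random

def log_likelihood(x):
    # Standard normal log-density (the toy Gaussian example)
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

def prior_sample(rng):
    # Uniform prior on [-5, 5]; prior density is 1/10
    return rng.uniform(-5.0, 5.0)

def logaddexp(a, b):
    # Numerically stable ln(e^a + e^b)
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling_log_z(n_live=100, n_iter=800, seed=0):
    """Estimate ln Z = ln ∫ L(x) π(x) dx with basic nested sampling."""
    rng = random.Random(seed)
    live = [prior_sample(rng) for _ in range(n_live)]
    live_logl = [log_likelihood(x) for x in live]
    log_z = -math.inf
    log_x_prev = 0.0  # ln X_0: all of the prior mass
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: live_logl[j])
        logl_star = live_logl[worst]
        log_x = -i / n_live  # deterministic shrinkage: E[ln X_i] = -i/N
        # ln(X_{i-1} - X_i): prior-mass width of this likelihood shell
        log_w = log_x_prev + math.log1p(-math.exp(log_x - log_x_prev))
        log_z = logaddexp(log_z, logl_star + log_w)
        log_x_prev = log_x
        # Replace the worst point with a prior draw above the threshold;
        # plain rejection sampling is adequate for this 1-D toy problem.
        while True:
            x = prior_sample(rng)
            ll = log_likelihood(x)
            if ll > logl_star:
                break
        live[worst], live_logl[worst] = x, ll
    # Fold the surviving live points into the leftover prior mass.
    log_w = log_x_prev - math.log(n_live)
    for ll in live_logl:
        log_z = logaddexp(log_z, ll + log_w)
    return log_z

log_z_est = nested_sampling_log_z()
# Analytic answer: Z ≈ 0.1, i.e. ln Z ≈ -2.30
print(log_z_est)
```

The paper's contribution builds on this: rather than stopping at ln Z, it uses nested-sampling runs to estimate the log-probabilities that appear inside entropy and mutual-information integrals for distributions one can only sample from.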
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · stat.CO
Edward: A library for probabilistic modeling, inference, and criticism
R.I.P.
👻
Ghosted
Coresets for Scalable Bayesian Logistic Regression
R.I.P.
👻
Ghosted
colorspace: A Toolbox for Manipulating and Assessing Colors and Palettes
R.I.P.
👻
Ghosted
Fast Discrete Distribution Clustering Using Wasserstein Barycenter with Sparse Support
R.I.P.
👻
Ghosted