Computing Entropies With Nested Sampling

July 12, 2017 · Entered Twilight · Entropy

TWILIGHT: Old Age
Predates the code-sharing era; a pioneer of its time

"Last commit was 6.0 years ago (≥ 5-year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitattributes, .gitignore, LICENSE, README.md, cpp, paper, talk

Authors: Brendon J. Brewer
arXiv ID: 1707.03543
Category: stat.CO
Cross-listed: astro-ph.IM, cs.IT, physics.data-an
Citations: 11
Venue: Entropy
Repository: https://github.com/eggplantbren/InfoNest
Stars: 7
Last Checked: 1 month ago
Abstract
The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario.
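The Gaussian example mentioned in the abstract has a closed-form entropy, which is what makes it a useful benchmark. As an illustrative sketch (this is a plain Monte Carlo check of the analytic value, not the paper's Nested Sampling scheme, which targets distributions whose densities cannot be evaluated; function names here are hypothetical):

```python
import math
import random

def gaussian_entropy_analytic(sigma):
    # Differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2).
    # Independent of the mean mu.
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma**2)

def gaussian_entropy_mc(sigma, n=200_000, seed=0):
    # Monte Carlo estimate H ~ -E[ln p(x)] using samples drawn from the
    # density itself. Only works when p(x) can be evaluated pointwise,
    # which is exactly the restriction Nested Sampling lifts.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        log_p = -0.5 * math.log(2.0 * math.pi * sigma**2) - 0.5 * (x / sigma) ** 2
        total += log_p
    return -total / n
```

For sigma = 1 the analytic value is 0.5 * ln(2*pi*e) ≈ 1.419 nats, and the Monte Carlo estimate should agree to within its sampling error.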
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

Similar Papers

In the same crypt: stat.CO