Topological mixture estimation
December 12, 2017 · Declared Dead · International Conference on Machine Learning
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Steve Huntsman
arXiv ID: 1712.04487
Category: stat.ME
Cross-listed: cs.IT, math.AT, math.ST
Citations: 7
Venue: International Conference on Machine Learning
Last Checked: 1 month ago
Abstract
Density functions that represent sample data are often multimodal, i.e. they exhibit more than one maximum. Typically this behavior is taken to indicate that the underlying data deserves a more detailed representation as a mixture of densities with individually simpler structure. The usual specification of a component density is quite restrictive, with log-concave the most general case considered in the literature, and Gaussian the overwhelmingly typical case. It is also necessary to determine the number of mixture components a priori, and much art is devoted to this. Here, we introduce topological mixture estimation, a completely nonparametric and computationally efficient solution to the one-dimensional problem where mixture components need only be unimodal. We repeatedly perturb the unimodal decomposition of Baryshnikov and Ghrist to produce a topologically and information-theoretically optimal unimodal mixture. We also detail a smoothing process that optimally exploits topological persistence of the unimodal category in a natural way when working directly with sample data. Finally, we illustrate these techniques through examples.
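The abstract leans on topological persistence of the modes of a one-dimensional density. The following is a minimal sketch, not the paper's implementation, of the standard 0-dimensional persistence computation for local maxima of a 1D signal: sweep the values in descending order, merge neighboring components with a union-find structure, and record a (birth, death) pair each time a lower peak is absorbed by a higher one. Thresholding persistence then counts the "significant" modes; the function name, example histogram, and threshold are illustrative choices, not from the paper.

```python
def peak_persistence(values):
    """Return (birth, death) pairs for the local maxima of a 1D signal,
    computed by a descending sweep with union-find."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i], reverse=True)
    parent = {}  # index -> representative peak index, for seen indices only

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    pairs = []
    for i in order:
        left = i - 1 if i - 1 in parent else None
        right = i + 1 if i + 1 in parent else None
        parent[i] = i
        if left is None and right is None:
            continue  # a new peak is born at height values[i]
        if left is not None and right is not None:
            a, b = find(left), find(right)
            # the lower of the two peaks dies when the components merge here
            lo, hi = sorted((a, b), key=lambda j: values[j])
            pairs.append((values[lo], values[i]))
            parent[lo] = hi
            parent[i] = hi
        else:
            parent[i] = find(left if left is not None else right)
    # the global maximum never dies; pair it with 0 for a density
    pairs.append((values[order[0]], 0.0))
    return pairs

# Example: a bimodal histogram with one small noise bump.
hist = [0.1, 0.8, 0.3, 0.35, 0.2, 0.9, 0.1]
pairs = peak_persistence(hist)
modes = [p for p in pairs if p[0] - p[1] > 0.2]  # persistence threshold
# the bump at 0.35 has persistence 0.05 and is filtered out; 2 modes remain
```

A smoothing step in the spirit of the abstract would then blur the histogram only as much as this persistence diagram allows, so that topologically insignificant bumps vanish while the dominant modes survive.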
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · stat.ME
👻 Ghosted · Performance Metrics (Error Measures) in Machine Learning Regression, Forecasting and Prognostics: Properties and Typology
👻 Ghosted · External Validity: From Do-Calculus to Transportability Across Populations
👻 Ghosted · Least Ambiguous Set-Valued Classifiers with Bounded Error Levels
👻 Ghosted · Doubly Robust Policy Evaluation and Optimization
👻 Ghosted · Comparison of Bayesian predictive methods for model selection
Died the same way · 👻 Ghosted
👻 Ghosted · Language Models are Few-Shot Learners
👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
👻 Ghosted · XGBoost: A Scalable Tree Boosting System