A Riemannian Framework for Statistical Analysis of Topological Persistence Diagrams
May 28, 2016 · Declared Dead · 2016 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Rushil Anirudh, Vinay Venkataraman, Karthikeyan Natesan Ramamurthy, Pavan Turaga
arXiv ID
1605.08912
Category
math.AT
Cross-listed
cs.CG, cs.CV, math.DG, math.ST
Citations
39
Venue
2016 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Last Checked
1 month ago
Abstract
Topological data analysis is becoming a popular way to study high-dimensional feature spaces without any contextual clues or assumptions. This paper concerns itself with one popular topological feature, the number of $d$-dimensional holes in the dataset, also known as the Betti-$d$ number. The persistence of the Betti numbers over various scales is encoded into a persistence diagram (PD), which indicates the birth and death times of these holes as the scale varies. A common way to compare PDs is by a point-to-point matching, given by the $n$-Wasserstein metric. However, a big drawback of this approach is the need to solve the correspondence between points before computing the distance; for $n$ points, the complexity grows as $\mathcal{O}(n^3)$. Instead, we propose an entirely new framework built on Riemannian geometry that models PDs as 2D probability density functions represented in the square-root framework on a Hilbert sphere. The resulting space is much more intuitive, with closed-form expressions for common operations. The distance metric is 1) correspondence-free and 2) independent of the number of points in the dataset. The complexity of computing the distance between PDs now grows as $\mathcal{O}(K^2)$ for a $K \times K$ discretization of $[0,1]^2$. This also enables the use of existing machinery in differential geometry for statistical analysis of PDs, such as computing means, geodesics, and classification. We report results competitive with the Wasserstein metric at a much lower computational load, indicating the favorable properties of the proposed approach.
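The distance the abstract describes can be sketched in a few lines: rasterize each PD into a density on a $K \times K$ grid, take pointwise square roots so the density lies on the unit Hilbert sphere, and measure the arc length between the two points on that sphere. This is a minimal illustration, not the authors' implementation; the function names, the Gaussian-kernel rasterization, and the bandwidth `sigma` are assumptions chosen for the sketch.

```python
import numpy as np

def pd_to_density(points, K=64, sigma=0.05):
    """Rasterize a persistence diagram (list of (birth, death) pairs in
    [0,1]^2) into a K x K discrete probability density using a Gaussian
    kernel placed at each diagram point (an assumed smoothing choice)."""
    xs = np.linspace(0.0, 1.0, K)
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    density = np.zeros((K, K))
    for b, d in points:
        density += np.exp(-((X - b) ** 2 + (Y - d) ** 2) / (2 * sigma ** 2))
    return density / density.sum()  # normalize so the grid sums to 1

def sphere_distance(p, q):
    """Geodesic distance between two discrete pdfs under the square-root
    map: sqrt(p) has unit norm, so it lies on the unit Hilbert sphere and
    d(p, q) = arccos(<sqrt(p), sqrt(q)>).  No point correspondence is
    needed, and the cost is O(K^2) for a K x K grid."""
    inner = np.sum(np.sqrt(p) * np.sqrt(q))
    return np.arccos(np.clip(inner, -1.0, 1.0))

# Two toy diagrams: identical diagrams give distance 0; nearby ones are close.
pd1 = [(0.2, 0.5), (0.3, 0.9)]
pd2 = [(0.25, 0.55), (0.3, 0.85)]
print(sphere_distance(pd_to_density(pd1), pd_to_density(pd2)))
```

Note the contrast with the Wasserstein metric: there is no $\mathcal{O}(n^3)$ matching step, and the cost depends only on the grid resolution $K$, not on the number of points in either diagram.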
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – math.AT
👻 Ghosted · Persistence Diagrams with Linear Machine Learning Models
👻 Ghosted · Comparing persistence diagrams through complex vectors
👻 Ghosted · Path homologies of deep feedforward networks
👻 Ghosted · From trees to barcodes and back again: theoretical and statistical perspectives
👻 Ghosted · Parametrized topological complexity of collision-free motion planning in the plane
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System