Inverse Kernel Decomposition

November 11, 2022 · Declared Dead · 🏛 Trans. Mach. Learn. Res.

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Chengrui Li, Anqi Wu
arXiv ID: 2211.05961
Category: cs.LG (Machine Learning)
Cross-listed: stat.ML
Citations: 1
Venue: Trans. Mach. Learn. Res.
Repository: https://github.com/JerrySoybean/ikd
Last Checked: 2 months ago
Abstract
State-of-the-art dimensionality reduction approaches largely rely on complicated optimization procedures, while closed-form approaches requiring merely an eigen-decomposition lack sophistication and nonlinearity. In this paper, we propose a novel nonlinear dimensionality reduction method, Inverse Kernel Decomposition (IKD), based on an eigen-decomposition of the sample covariance matrix of the data. The method is inspired by Gaussian process latent variable models (GPLVMs) and has comparable performance. To deal with very noisy data with weak correlations, we propose two solutions, blockwise and geodesic, which exploit locally correlated data points to provide better and numerically more stable latent estimations. Using synthetic datasets and four real-world datasets, we show that IKD is a better dimensionality reduction method than other eigen-decomposition-based methods, and achieves performance comparable to optimization-based methods with faster running speeds. An open-source Python implementation of IKD is available at https://github.com/JerrySoybean/ikd.
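For readers unfamiliar with the eigen-decomposition family the abstract places IKD in, the sketch below shows the generic pattern: center the observations, eigen-decompose an n-by-n sample covariance across samples, and read the latents off the top eigenvectors. This is a minimal classical-PCA/MDS-style baseline, not the paper's kernel-inversion step (see the linked repository for the actual method); the function name `eig_embed` and the toy data are hypothetical.

```python
import numpy as np

def eig_embed(Y, d=2):
    """Embed observations Y (n samples x p features) into d dimensions
    via eigen-decomposition of the n x n sample covariance across samples.

    Generic eigen-decomposition baseline (PCA/classical-MDS flavor),
    NOT the IKD inversion step from the paper.
    """
    Y = Y - Y.mean(axis=0)              # center each feature
    C = Y @ Y.T / Y.shape[1]            # n x n sample covariance across samples
    evals, evecs = np.linalg.eigh(C)    # eigenvalues in ascending order
    idx = np.argsort(evals)[::-1][:d]   # keep the top-d components
    return evecs[:, idx] * np.sqrt(np.maximum(evals[idx], 0.0))

# Usage: noisy high-dimensional observations of a 1-D latent curve
rng = np.random.default_rng(0)
z = np.linspace(0, 1, 200)
latents = np.stack([np.sin(2 * np.pi * z), np.cos(2 * np.pi * z)], axis=1)
Y = latents @ rng.normal(size=(2, 20)) + 0.1 * rng.normal(size=(200, 20))
X = eig_embed(Y, d=2)
print(X.shape)  # (200, 2)
```

Because the only heavy operation is one symmetric eigen-decomposition, this family of methods is closed-form and fast, which is the speed advantage the abstract claims over optimization-based approaches such as GPLVMs.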
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Machine Learning

Died the same way – 💀 404 Not Found