Beyond Pham's algorithm for joint diagonalization
November 28, 2018 · Declared Dead · The European Symposium on Artificial Neural Networks
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Pierre Ablin, Jean-FranΓ§ois Cardoso, Alexandre Gramfort
arXiv ID
1811.11433
Category
math.NA: Numerical Analysis
Cross-listed
cs.LG, stat.ML
Citations
16
Venue
The European Symposium on Artificial Neural Networks
Last Checked
1 month ago
Abstract
The approximate joint diagonalization of a set of matrices consists in finding a basis in which these matrices are as diagonal as possible. This problem naturally appears in several statistical learning tasks such as blind signal separation. We consider the diagonalization criterion studied in a seminal paper by Pham (2001), and propose a new quasi-Newton method for its optimization. Through numerical experiments on simulated and real datasets, we show that the proposed method outperforms Pham's algorithm. An open source Python package is released.
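To make the abstract concrete: Pham's criterion measures, for a candidate unmixing matrix B and a set of positive-definite matrices C_i, how far each B C_i B^T is from being diagonal. The sketch below is an illustrative NumPy implementation of that criterion only (not the authors' released package, which is not named on this page, and not their quasi-Newton optimizer); the function and variable names are my own.

```python
import numpy as np

def pham_criterion(B, Cs):
    """Pham's (2001) joint-diagonalization criterion:
    sum_i [ log det(diag(B C_i B^T)) - log det(B C_i B^T) ].
    By Hadamard's inequality it is >= 0 for SPD matrices, and it is
    zero iff every B C_i B^T is exactly diagonal."""
    total = 0.0
    for C in Cs:
        D = B @ C @ B.T
        # slogdet returns (sign, log|det|); sign is +1 for SPD D.
        total += np.sum(np.log(np.diag(D))) - np.linalg.slogdet(D)[1]
    return total

# Simulated SPD matrices sharing an exact joint diagonalizer inv(A):
# C_i = A D_i A^T with D_i diagonal, so inv(A) C_i inv(A)^T = D_i.
rng = np.random.default_rng(0)
p, n = 4, 10
A = rng.standard_normal((p, p))
Cs = [A @ np.diag(rng.uniform(1.0, 2.0, p)) @ A.T for _ in range(n)]

print(pham_criterion(np.eye(p), Cs))         # positive: identity does not diagonalize
print(pham_criterion(np.linalg.inv(A), Cs))  # near zero: the true unmixing matrix
```

Both Pham's original algorithm and the paper's quasi-Newton method can be read as different ways of driving this same objective to zero.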
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt – Numerical Analysis
- Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
- PDE-Net: Learning PDEs from Data
- Efficient tensor completion for color image and video recovery: Low-rank tensor train
- Tensor Ring Decomposition
- Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations

Died the same way – Ghosted
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System