Entrywise convergence of iterative methods for eigenproblems

February 19, 2020 · Declared Dead · 🏛 Neural Information Processing Systems

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Vasileios Charisopoulos, Austin R. Benson, Anil Damle
arXiv ID: 2002.08491
Category: math.NA (Numerical Analysis)
Cross-listed: cs.LG, cs.SI, stat.ML
Citations: 2
Venue: Neural Information Processing Systems
Last checked: 2 months ago
Abstract
Several problems in machine learning, statistics, and other fields rely on computing eigenvectors. For large scale problems, the computation of these eigenvectors is typically performed via iterative schemes such as subspace iteration or Krylov methods. While there is classical and comprehensive analysis for subspace convergence guarantees with respect to the spectral norm, in many modern applications other notions of subspace distance are more appropriate. Recent theoretical work has focused on perturbations of subspaces measured in the $\ell_{2 \to \infty}$ norm, but does not consider the actual computation of eigenvectors. Here we address the convergence of subspace iteration when distances are measured in the $\ell_{2 \to \infty}$ norm and provide deterministic bounds. We complement our analysis with a practical stopping criterion and demonstrate its applicability via numerical experiments. Our results show that one can get comparable performance on downstream tasks while requiring fewer iterations, thereby saving substantial computational time.
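The subspace iteration scheme with an ℓ_{2→∞}-norm stopping rule, as described in the abstract, can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the Procrustes alignment step, tolerance, and function names are assumptions made for the sketch. The ℓ_{2→∞} norm of a matrix is its largest row-wise ℓ2 norm, and eigenbases are only defined up to rotation, so successive iterates are aligned before measuring their distance.

```python
import numpy as np

def two_to_inf_norm(M):
    """l_{2->inf} norm: the largest row-wise l2 norm of M."""
    return np.linalg.norm(M, axis=1).max()

def subspace_iteration(A, k, tol=1e-8, max_iter=500, seed=0):
    """Orthogonal (subspace) iteration for the top-k eigenvectors of a
    symmetric matrix A. Stops when successive iterates are close in the
    l_{2->inf} norm, after an orthogonal Procrustes alignment to remove
    the rotational ambiguity of the eigenbasis. (Illustrative sketch.)"""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for it in range(max_iter):
        W, _ = np.linalg.qr(A @ V)         # one power step + re-orthogonalization
        U, _, Vt = np.linalg.svd(W.T @ V)  # Procrustes: best rotation aligning W to V
        err = two_to_inf_norm(W @ (U @ Vt) - V)
        V = W
        if err < tol:
            break
    return V, it + 1
```

The point of measuring convergence row-wise (ℓ_{2→∞}) rather than in the spectral norm is that it tracks the worst-case error of individual vector entries, which is the quantity that matters in entrywise applications; a criterion of this kind can justify stopping earlier than a spectral-norm bound would.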
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Numerical Analysis

R.I.P. 👻 Ghosted

Tensor Ring Decomposition

Qibin Zhao, Guoxu Zhou, ... (+3 more)

math.NA ๐Ÿ› arXiv ๐Ÿ“š 427 cites 9 years ago

Died the same way — 👻 Ghosted