Random Laplacian matrices and convex relaxations
April 15, 2015 · Declared Dead · Foundations of Computational Mathematics
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Afonso S. Bandeira
arXiv ID
1504.03987
Category
math.PR
Cross-listed
cs.DS, cs.SI, math.OC
Citations
105
Venue
Foundations of Computational Mathematics
Last Checked
1 month ago
Abstract
The largest eigenvalue of a matrix is always greater than or equal to its largest diagonal entry. We show that for a large class of random Laplacian matrices, this bound is essentially tight: the largest eigenvalue is, up to lower order terms, often the size of the largest diagonal entry. Besides being a simple tool to obtain precise estimates on the largest eigenvalue of a large class of random Laplacian matrices, our main result settles a number of open problems related to the tightness of certain convex relaxation-based algorithms. It easily implies the optimality of the semidefinite relaxation approaches to problems such as $\mathbb{Z}_2$ Synchronization and Stochastic Block Model recovery. Interestingly, this result readily implies the connectivity threshold for Erdős–Rényi graphs and suggests that these three phenomena are manifestations of the same underlying principle. The main tool is a recent estimate on the spectral norm of matrices with independent entries by van Handel and the author.
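The eigenvalue–diagonal comparison stated in the abstract is easy to see numerically. The sketch below is only an illustration, not the paper's method: it builds the Laplacian of an Erdős–Rényi graph (with arbitrary illustrative parameters `n` and `p`) and compares its largest eigenvalue against its largest diagonal entry, which for a graph Laplacian is the maximum degree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Erdős–Rényi graph G(n, p); n and p are arbitrary illustrative choices.
n, p = 500, 0.05
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                    # symmetric adjacency matrix, zero diagonal
D = np.diag(A.sum(axis=1))     # degree matrix
L = D - A                      # graph Laplacian

lam_max = np.linalg.eigvalsh(L)[-1]   # largest eigenvalue (eigvalsh sorts ascending)
d_max = L.diagonal().max()            # largest diagonal entry = maximum degree

# e_i^T L e_i = L_ii, so the largest eigenvalue always dominates every
# diagonal entry; the paper's result is that for random Laplacians the
# two quantities agree up to lower order terms.
print(lam_max, d_max)
```

For this regime one should observe that the ratio `lam_max / d_max` stays close to 1 as `n` grows, which is the tightness phenomenon the abstract describes.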
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · math.PR
An Introduction to Matrix Concentration Inequalities · 👻 Ghosted
Non-backtracking spectrum of random graphs: community detection and non-regular Ramanujan graphs · 👻 Ghosted
Convergence of the Deep BSDE Method for Coupled FBSDEs · 👻 Ghosted
A Random Matrix Approach to Neural Networks · 👻 Ghosted
Concentration and regularization of random graphs · 👻 Ghosted
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners · 👻 Ghosted
PyTorch: An Imperative Style, High-Performance Deep Learning Library · 👻 Ghosted
XGBoost: A Scalable Tree Boosting System · 👻 Ghosted