Spectral Statistics of Lattice Graph Percolation Models
September 26, 2016 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Stephen Kruzick, Jose M. F. Moura
arXiv ID
1611.02655
Category
math.NA: Numerical Analysis
Cross-listed
cs.IT
Citations
1
Venue
arXiv.org
Last Checked
2 months ago
Abstract
In graph signal processing, the graph adjacency matrix or the graph Laplacian commonly defines the shift operator. The spectral decomposition of the shift operator plays an important role in that the eigenvalues represent frequencies and the eigenvectors provide a spectral basis. This is useful, for example, in the design of filters. However, the graph or network may be uncertain due to stochastic influences in construction and maintenance, and, under such conditions, the eigenvalues of the shift matrix become random variables. This paper examines the spectral distribution of the eigenvalues of random networks formed by including each link of a D-dimensional lattice supergraph independently with identical probability, a percolation model. Using the stochastic canonical equation methods developed by Girko for symmetric matrices with independent upper triangular entries, a deterministic distribution is found that asymptotically approximates the empirical spectral distribution of the scaled adjacency matrix for a model with arbitrary parameters. The main results characterize the form of the solution to an important system of equations that leads to this deterministic distribution function and significantly reduce the number of equations that must be solved to find the solution for a given set of model parameters. Simulations comparing the expected empirical spectral distributions and the computed deterministic distributions are provided for sample parameters.
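As an illustration of the percolation model the abstract describes, the sketch below builds the adjacency matrix of a bond-percolated square lattice (the D = 2 case; each lattice edge kept independently with probability p) and computes its empirical spectrum with NumPy. This is not the paper's code: the function names, the lattice size, and the choice to return unscaled eigenvalues are our assumptions, and the paper's particular scaling of the adjacency matrix is not reproduced here.

```python
import numpy as np

def lattice_edges(n):
    """Edges of an n-by-n square lattice (D = 2), nodes indexed row-major."""
    edges = []
    for r in range(n):
        for c in range(n):
            i = r * n + c
            if c + 1 < n:
                edges.append((i, i + 1))   # horizontal neighbor
            if r + 1 < n:
                edges.append((i, i + n))   # vertical neighbor
    return edges

def percolation_spectrum(n=20, p=0.5, seed=0):
    """Eigenvalues of the adjacency matrix of a bond-percolated n-by-n
    lattice, where each edge is kept independently with probability p."""
    rng = np.random.default_rng(seed)
    N = n * n
    A = np.zeros((N, N))
    for i, j in lattice_edges(n):
        if rng.random() < p:      # include this link with probability p
            A[i, j] = A[j, i] = 1.0
    # Symmetric matrix: eigvalsh returns real eigenvalues in ascending order.
    return np.linalg.eigvalsh(A)

eigs = percolation_spectrum()
```

Averaging histograms of `eigs` over many random seeds approximates the expected empirical spectral distribution that the paper compares against its computed deterministic distribution.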
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Numerical Analysis
Ghosted · Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
Ghosted · PDE-Net: Learning PDEs from Data
Ghosted · Efficient tensor completion for color image and video recovery: Low-rank tensor train
Ghosted · Tensor Ring Decomposition
Ghosted · Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations
Died the same way · Ghosted
Ghosted · Language Models are Few-Shot Learners
Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
Ghosted · XGBoost: A Scalable Tree Boosting System