Investigating echo state networks dynamics by means of recurrence analysis
January 26, 2016 · Declared Dead · IEEE Transactions on Neural Networks and Learning Systems
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Filippo Maria Bianchi, Lorenzo Livi, Cesare Alippi
arXiv ID
1601.07381
Category
physics.data-an
Cross-listed
cs.LG, nlin.CD
Citations
93
Venue
IEEE Transactions on Neural Networks and Learning Systems
Last Checked
1 month ago
Abstract
In this paper, we elaborate on the well-known interpretability issue in echo state networks (ESNs). The idea is to investigate the dynamics of reservoir neurons with time-series analysis techniques drawn from research on complex systems. Notably, we analyze time-series of neuron activations with Recurrence Plots (RPs) and Recurrence Quantification Analysis (RQA), which make it possible to visualize and characterize high-dimensional dynamical systems. We show that this approach is useful in a number of ways. First, the two-dimensional representation offered by RPs provides a way to visualize the high-dimensional dynamics of a reservoir. Our results suggest that, if the network is stable, reservoir and input exhibit similar line patterns in their respective RPs. Conversely, the more unstable the ESN, the more the RP of the reservoir presents instability patterns. As a second result, we show that the $\mathrm{L_{max}}$ measure is highly correlated with the well-established maximal local Lyapunov exponent. This suggests that complexity measures based on the distribution of RP diagonal lines provide a valuable tool for quantifying the degree of network stability. Finally, our analysis shows that all RQA measures fluctuate in the proximity of the so-called edge of stability, where an ESN typically achieves maximum computational capability. We verify that the determination of the edge of stability provided by such RQA measures is more accurate than two well-known criteria based on the Jacobian matrix of the reservoir. Therefore, we claim that RP- and RQA-based analyses can be used as valuable tools to design an effective network for a given problem.
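The paper's code was never released, but the pipeline the abstract describes — drive a reservoir, threshold pairwise state distances into a recurrence plot, then read off a diagonal-line RQA measure such as $\mathrm{L_{max}}$ — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the reservoir size, spectral radius, toy sine input, and 10th-percentile threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal ESN reservoir (hypothetical parameters, not the paper's setup):
#   x[t+1] = tanh(W x[t] + w_in u[t])
N = 50                                            # reservoir size
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius 0.9 (stable regime)
w_in = rng.uniform(-1.0, 1.0, size=N)             # input weights

u = np.sin(0.2 * np.arange(300))                  # toy periodic input signal
x = np.zeros(N)
states = []
for t in range(len(u)):
    x = np.tanh(W @ x + w_in * u[t])
    states.append(x.copy())
states = np.asarray(states)[100:]                 # discard the initial transient

# Recurrence plot: R[i, j] = 1 iff reservoir states i and j are closer than eps
d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
eps = np.percentile(d, 10)                        # threshold: 10th percentile of distances
R = (d <= eps).astype(int)

# L_max: length of the longest diagonal line off the main diagonal
def longest_diagonal(R):
    best = 0
    for k in range(1, R.shape[0]):                # scan every off-main diagonal
        run = cur = 0
        for v in np.diagonal(R, offset=k):
            cur = cur + 1 if v else 0
            run = max(run, cur)
        best = max(best, run)
    return best

print(R.shape, longest_diagonal(R))
```

With a stable reservoir and a periodic input, the RP shows the long diagonal lines the abstract refers to; pushing the spectral radius past 1 (toward the edge of stability) breaks those lines up, which is what the RQA measures quantify.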
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt → physics.data-an
👻 Ghosted · A deep convolutional neural network approach to single-particle recognition in cryo-electron microscopy
👻 Ghosted · The Pandora Software Development Kit for Pattern Recognition
👻 Ghosted · Emergence of Compositional Representations in Restricted Boltzmann Machines
👻 Ghosted · Discovering state-parameter mappings in subsurface models using generative adversarial networks
👻 Ghosted · Machine Learning for Anomaly Detection in Particle Physics
Died the same way → 👻 Ghosted
👻 Ghosted · Language Models are Few-Shot Learners
👻 Ghosted · PyTorch: An Imperative Style, High-Performance Deep Learning Library
👻 Ghosted · XGBoost: A Scalable Tree Boosting System