On the stability properties of Gated Recurrent Units neural networks
November 13, 2020 · Declared Dead · Systems & Control Letters (Print)
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Fabio Bonassi, Marcello Farina, Riccardo Scattolini
arXiv ID
2011.06806
Category
eess.SY: Systems & Control (EE)
Cross-listed
cs.LG
Citations
54
Venue
Systems & Control Letters (Print)
Last Checked
1 month ago
Abstract
The goal of this paper is to provide sufficient conditions for guaranteeing the Input-to-State Stability (ISS) and the Incremental Input-to-State Stability (δISS) of Gated Recurrent Unit (GRU) neural networks. These conditions, devised for both single-layer and multi-layer architectures, consist of nonlinear inequalities on the network's weights. They can be employed to check the stability of trained networks, or enforced as constraints during the training of a GRU. The resulting training procedure is tested on a Quadruple Tank nonlinear benchmark system, showing satisfactory modeling performance.
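The exact nonlinear inequalities on the weights are derived in the paper itself. Purely as a rough illustration of how a weight-based stability check and a training-time penalty of this kind might look, here is a minimal Python/NumPy sketch. The surrogate condition used below (an infinity-norm contraction bound on the recurrent weight matrix of the candidate state, ||U_h||_inf < 1), the margin parameter, and all function names are illustrative assumptions, not the paper's actual conditions.

```python
import numpy as np

# Minimal sketch only: the paper derives specific nonlinear inequalities on
# the GRU weights as sufficient conditions for ISS / delta-ISS. The surrogate
# used here (an infinity-norm contraction bound on the candidate-state
# recurrent weight U_h) is a hypothetical stand-in for illustration.

def inf_norm(W):
    """Induced infinity-norm of a matrix: maximum absolute row sum."""
    return np.max(np.sum(np.abs(W), axis=1))

def stability_check(U_h, margin=1e-3):
    """True if the placeholder condition ||U_h||_inf < 1 - margin holds."""
    return inf_norm(U_h) < 1.0 - margin

def stability_penalty(U_h, margin=1e-3):
    """Hinge penalty a training loop could add to its loss to push the
    weights toward the region where the placeholder condition holds."""
    return max(0.0, inf_norm(U_h) - (1.0 - margin))

# Example: check a randomly initialized recurrent weight matrix.
rng = np.random.default_rng(0)
U_h = 0.1 * rng.standard_normal((8, 8))
print(stability_check(U_h), stability_penalty(U_h))
```

As the abstract notes, conditions of this kind can either be verified post hoc on a trained network or imposed during training; a hinge penalty like the one above is one common way to softly enforce such a constraint in a training loss.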
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Systems & Control (EE)
Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey
Wireless Network Design for Control Systems: A Survey
Learning-based Model Predictive Control for Safe Exploration
Safety-Critical Model Predictive Control with Discrete-Time Control Barrier Function
Novel Multidimensional Models of Opinion Dynamics in Social Networks
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System