On the relationship between Koopman operator approximations and neural ordinary differential equations for data-driven time-evolution predictions
November 20, 2024 · Declared Dead · Chaos
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Jake Buzhardt, C. Ricardo Constante-Amores, Michael D. Graham
arXiv ID
2411.12940
Category
nlin.CD
Cross-listed
cs.LG
Citations
5
Venue
Chaos
Last Checked
1 month ago
Abstract
This work explores the relationship between state space methods and Koopman operator-based methods for predicting the time-evolution of nonlinear dynamical systems. We demonstrate that extended dynamic mode decomposition with dictionary learning (EDMD-DL), when combined with a state space projection, is equivalent to a neural network representation of the nonlinear discrete-time flow map on the state space. We highlight how this projection step introduces nonlinearity into the evolution equations, enabling significantly improved EDMD-DL predictions. With this projection, EDMD-DL leads to a nonlinear dynamical system on the state space, which can be represented in either discrete or continuous time. This system has a natural structure for neural networks, where the state is first expanded into a high dimensional feature space followed by a linear mapping which represents the discrete-time map or the vector field as a linear combination of these features. Inspired by these observations, we implement several variations of neural ordinary differential equations (ODEs) and EDMD-DL, developed by combining different aspects of their respective model structures and training procedures. We evaluate these methods using numerical experiments on chaotic dynamics in the Lorenz system and a nine-mode model of turbulent shear flow, showing comparable performance across methods in terms of short-time trajectory prediction, reconstruction of long-time statistics, and prediction of rare events. These results highlight the equivalence of the EDMD-DL implementation with a state space projection to a neural ODE representation of the dynamics. We also show that these methods provide comparable performance to a non-Markovian approach in terms of prediction of extreme events.
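The structure described above (expand the state into a high-dimensional feature space, apply a learned linear map, then project back to the state) can be illustrated with a minimal numpy sketch. This is not the paper's EDMD-DL implementation (there is no dictionary learning here): the monomial dictionary `psi`, the toy discrete-time system `f`, and all names are illustrative assumptions, showing plain EDMD with a state-space projection.

```python
import numpy as np

def psi(x):
    """Dictionary of observables: the state plus quadratic monomials
    (an assumed, fixed choice; EDMD-DL would learn this dictionary)."""
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([x1, x2, x1**2, x1 * x2, x2**2], axis=-1)

def f(x):
    """Toy nonlinear discrete-time map used to generate snapshot pairs."""
    return np.stack([0.9 * x[..., 0],
                     0.5 * x[..., 1] + x[..., 0]**2], axis=-1)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))  # snapshots x_n
Y = f(X)                                   # snapshots x_{n+1}

# EDMD step: least-squares fit of the linear operator K on the lifted
# states, psi(x_{n+1}) ~= psi(x_n) K.
K, *_ = np.linalg.lstsq(psi(X), psi(Y), rcond=None)

def flow_map(x):
    """Lift, apply the linear map, then project back to the state.
    The first two dictionary elements are the state itself, so the
    projection is a column selection; composing it with the lift makes
    the resulting state-space map nonlinear."""
    return (psi(x) @ K)[..., :2]

x_test = np.array([0.3, -0.4])
pred = flow_map(x_test)
err = np.linalg.norm(pred - f(x_test))
```

Because both components of this toy map lie in the span of the chosen dictionary, the projected one-step prediction is essentially exact; for a generic system the dictionary choice (or its learning, as in EDMD-DL) controls this error.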
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – nlin.CD
👻 Ghosted
Persistent Homology of Complex Networks for Dynamic State Detection
Dynamical Complexity Of Short and Noisy Time Series
Shannon Entropy Rate of Hidden Markov Processes
Theoretical design and circuit implementation of integer domain chaotic systems
Spectral Simplicity of Apparent Complexity, Part I: The Nondiagonalizable Metadynamics of Prediction
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System