Learning Subgrid-scale Models with Neural Ordinary Differential Equations

December 20, 2022 · Declared Dead · 🏛 Computers & Fluids

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Shinhoo Kang, Emil M. Constantinescu
arXiv ID: 2212.09967
Category: math.NA: Numerical Analysis
Cross-listed: cs.CE, cs.LG
Citations: 9
Venue: Computers & Fluids
Last Checked: 1 month ago
Abstract
We propose a new approach, based on neural ordinary differential equations (NODEs), to learning the subgrid-scale model when simulating partial differential equations (PDEs) solved by the method of lines and their representation as chaotic ordinary differential equations. Solving systems on fine temporal and spatial grids is an ongoing computational challenge, and closure models are generally difficult to tune. Machine learning approaches have increased the accuracy and efficiency of computational fluid dynamics solvers. In this approach, neural networks are used to learn the coarse- to fine-grid map, which can be viewed as a subgrid-scale parameterization. We propose a strategy that uses the NODE and partial knowledge of the governing equations to learn the source dynamics at a continuous level. Our method inherits the advantages of NODEs and can be used to parameterize subgrid scales, approximate coupling operators, and improve the efficiency of low-order solvers. Numerical results with the two-scale Lorenz 96 ODE, the convection-diffusion PDE, and the viscous Burgers' PDE are used to illustrate this approach.
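To make the abstract's idea concrete, here is a minimal sketch (not the authors' code, and the paper itself provides none) of how a NODE-style subgrid closure plugs into a coarse solver: the right-hand side of the two-scale Lorenz 96 coarse model is augmented with a small neural network standing in for the learned source term. The MLP weights below are random placeholders; in the NODE approach they would be trained by differentiating through the ODE solver against fine-scale reference data.

```python
import numpy as np

K = 8       # number of coarse Lorenz 96 variables (illustrative choice)
F = 10.0    # constant forcing

rng = np.random.default_rng(0)
# Hypothetical closure-network weights; a trained NODE would learn these.
W1 = rng.normal(0.0, 0.1, (16, K))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (K, 16))
b2 = np.zeros(K)

def closure(x):
    """Tiny MLP standing in for the learned subgrid-scale source term."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def rhs(x):
    """Coarse Lorenz 96 tendency dX_k/dt = (X_{k+1}-X_{k-2})X_{k-1} - X_k + F,
    plus the neural closure approximating the unresolved fine-scale coupling."""
    advect = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1)
    return advect - x + F + closure(x)

def rk4_step(x, dt):
    """One classical fourth-order Runge-Kutta step (method of lines in time)."""
    k1 = rhs(x)
    k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2)
    k4 = rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate from a perturbed equilibrium; the closure acts continuously in time,
# which is the key advantage the abstract attributes to NODEs.
x = F + 0.01 * rng.standard_normal(K)
dt = 0.005
for _ in range(200):
    x = rk4_step(x, dt)

print(x.shape, bool(np.isfinite(x).all()))
```

The same structure carries over to the convection-diffusion and viscous Burgers' examples: discretize in space, then let the closure network supply the missing subgrid tendency inside the time integrator.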
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Numerical Analysis

R.I.P. 👻 Ghosted

Tensor Ring Decomposition

Qibin Zhao, Guoxu Zhou, ... (+3 more)

math.NA 🏛 arXiv 📚 427 cites 9 years ago

Died the same way — 👻 Ghosted