Comparison of Generative Learning Methods for Turbulence Surrogates
November 25, 2024 · Declared Dead
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Claudia Drygala, Edmund Ross, Mohammad Sharifi Ghazijahani, Christian Cierpka, Francesca di Mare, Hanno Gottschalk
arXiv ID
2411.16417
Category
physics.flu-dyn
Cross-listed
cs.CV
Citations
0
Last Checked
1 month ago
Abstract
Numerical simulations of turbulent flows present significant challenges in fluid dynamics due to their complexity and high computational cost. High-resolution techniques such as Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) are generally not computationally affordable, particularly for technologically relevant problems. Recent advances in machine learning, specifically in generative probabilistic models, offer promising alternatives as surrogates for turbulence. This paper investigates the application of three generative models - Variational Autoencoders (VAE), Deep Convolutional Generative Adversarial Networks (DCGAN), and Denoising Diffusion Probabilistic Models (DDPM) - in simulating a von Kármán vortex street around a fixed cylinder projected into 2D, as well as a real-world experimental dataset of the wake flow of a cylinder array. Training data were obtained by means of LES in the simulated case and Particle Image Velocimetry (PIV) in the experimental case. We evaluate each model's ability to capture the statistical properties and spatial structures of the turbulent flow. Our results demonstrate that DDPM and DCGAN effectively replicate all flow distributions, highlighting their potential as efficient and accurate tools for turbulence surrogacy. We find a strong argument for DCGAN: although they are more difficult to train (due to problems such as mode collapse), they show the fastest inference and training times, require less training data than VAE and DDPM, and produce the results most closely aligned with the input stream. In contrast, VAE train quickly (and can generate samples quickly) but do not produce adequate results, and DDPM, whilst effective, are significantly slower in both inference and training.
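The abstract's observation that DDPM inference is slow follows from how diffusion models work: training uses a closed-form forward (noising) process, but sampling must reverse it one step at a time. As background, the forward process can be sketched in a few lines of plain Python. The linear beta schedule and the toy 1-D "flow field" below are illustrative assumptions, not details taken from the paper:

```python
import math
import random

def linear_beta_schedule(timesteps, beta_start=1e-4, beta_end=0.02):
    """Linearly spaced noise variances beta_1..beta_T (a common default)."""
    step = (beta_end - beta_start) / (timesteps - 1)
    return [beta_start + i * step for i in range(timesteps)]

def alpha_bar(betas, t):
    """Cumulative signal retention: alpha_bar_t = prod_{s<=t} (1 - beta_s)."""
    prod = 1.0
    for b in betas[: t + 1]:
        prod *= 1.0 - b
    return prod

def q_sample(x0, t, betas, rng=random):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,  eps ~ N(0, 1).
    This is why DDPM training is cheap: any timestep is reachable in one step.
    """
    ab = alpha_bar(betas, t)
    return [math.sqrt(ab) * x + math.sqrt(1.0 - ab) * rng.gauss(0.0, 1.0)
            for x in x0]

# Noise a toy 1-D field at an early and a late timestep.
betas = linear_beta_schedule(1000)
x0 = [math.sin(0.1 * i) for i in range(64)]
x_early = q_sample(x0, 10, betas)     # still mostly signal
x_late = q_sample(x0, 999, betas)     # essentially pure Gaussian noise
```

Generation, by contrast, must denoise step by step from t = T down to t = 0, so a 1000-step schedule costs roughly 1000 network evaluations per sample, whereas a DCGAN or VAE decoder generates a sample in a single forward pass. That asymmetry is consistent with the abstract's finding that DDPM are effective but slow at inference.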
Similar Papers
In the same crypt: physics.flu-dyn
- 👻 Ghosted: Efficient collective swimming by harnessing vortices through deep reinforcement learning
- 👻 Ghosted: NVIDIA SimNet^{TM}: an AI-accelerated multi-physics simulation framework
- 👻 Ghosted: Teaching the Incompressible Navier-Stokes Equations to Fast Neural Surrogate Models in 3D
- 👻 Ghosted: Prediction of Reynolds Stresses in High-Mach-Number Turbulent Boundary Layers using Physics-Informed Machine Learning
- 👻 Ghosted: From Deep to Physics-Informed Learning of Turbulence: Diagnostics

Died the same way: 👻 Ghosted
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System