Buzzicotti, M., Bonaccorso, F., Di Leoni, P.C., Biferale, L. (2021). Reconstruction of turbulent data with deep generative models for semantic inpainting from TURB-Rot database. PHYSICAL REVIEW FLUIDS, 6(5) [10.1103/PhysRevFluids.6.050503].
Reconstruction of turbulent data with deep generative models for semantic inpainting from TURB-Rot database
Buzzicotti, M.; Biferale, L.
2021-01-01
Abstract
We study the applicability of tools developed by the computer vision community for feature learning and semantic image inpainting to perform data reconstruction of fluid turbulence configurations. The aim is twofold. First, we explore on a quantitative basis the capability of convolutional neural networks embedded in a deep generative adversarial model (deep-GAN) to generate missing data in turbulence, a paradigmatic high-dimensional chaotic system. In particular, we investigate their use in reconstructing two-dimensional damaged snapshots extracted from a large database of numerical configurations of three-dimensional turbulence in the presence of rotation, a case with multiscale random features where both large-scale organized structures and small-scale highly intermittent and non-Gaussian fluctuations are present. Second, following a reverse engineering approach, we aim to rank the input flow properties (features) in terms of their qualitative and quantitative importance to obtain a better set of reconstructed fields. We present two approaches, both based on context encoders. The first one infers the missing data via a minimization of the L2 pixel-wise reconstruction loss, plus a small adversarial penalization. The second searches for the closest encoding of the corrupted flow configuration from a previously trained generator. Finally, we present a comparison with a different data assimilation tool, based on Nudging, an equation-informed unbiased protocol, well known in the numerical weather prediction community. The TURB-Rot database of roughly 300 K two-dimensional turbulent images is released and details on how to download it are given.
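The first context-encoder approach summarized above combines a pixel-wise L2 reconstruction loss over the damaged region with a small adversarial penalization. A minimal sketch of such a combined objective is given below; the function and parameter names (e.g. `lambda_adv`, `disc_score`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def inpainting_loss(true_field, generated_field, mask,
                    disc_score, lambda_adv=0.01):
    """Sketch of a context-encoder objective: a pixel-wise L2
    reconstruction loss restricted to the damaged (masked) region,
    plus a small adversarial penalization weighted by lambda_adv.
    All names and the default weight are illustrative."""
    # L2 reconstruction loss, averaged over the missing region (mask == 1)
    rec = np.sum(mask * (true_field - generated_field) ** 2) / np.sum(mask)
    # Adversarial term: disc_score is the discriminator's probability
    # that the generated field is real; low scores are penalized
    adv = -np.log(disc_score + 1e-12)
    return rec + lambda_adv * adv
```

With `lambda_adv = 0` this reduces to the plain masked L2 loss; the small adversarial weight nudges the generator toward fields the discriminator accepts as realistic.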
File | Size | Format | Access
---|---|---|---
PhysRevFluids.6.050503.pdf (Publisher's Version, PDF; license: publisher's copyright) | 9.03 MB | Adobe PDF | Authorized users only
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.