{"title":"Unsteady-state turbulent flow field predictions with a convolutional autoencoder architecture","authors":"Álvaro Abucide, Koldo Portal, Unai Fernandez-Gamiz, Ekaitz Zulueta, Iker Azurmendi","doi":"10.3934/math.20231522","DOIUrl":null,"url":null,"abstract":"<abstract> <p>Traditional numerical methods, such as computational fluid dynamics (CFD), demand large computational resources and memory for modeling fluid dynamic systems. Hence, deep learning (DL) and, specifically Convolutional Neural Networks (CNN) autoencoders have resulted in accurate tools to obtain approximations of the streamwise and vertical velocities and pressure fields, when stationary flows are considered. The novelty of this paper consists of predicting the future instants from an initial one with a CNN autoencoder architecture when an unsteady flow is considered. Two neural models are proposed: The former predicts the future instants on the basis of an initial sample and the latter approximates the initial sample. The inputs of the CNNs take the signed distance function (SDF) and the flow region channel (FRC), and, for the representation of the temporal evolution, the previous CFD sample is added. To increment the amount of training data of the second neural model, a data augmentation technique based on the similarity principle for fluid dynamics is implemented. As a result, low absolute error rates are obtained in the prediction of the first samples near the shapes surfaces. Even in the most advanced time instants, the prediction of the vortices zone is quite reliable. 
62.12 and 9000 speed-up ratios are achieved by the predictions of the first and second neural models, respectively, compared to the computational cost regarded by the CFD simulations.</p> </abstract>","PeriodicalId":48562,"journal":{"name":"AIMS Mathematics","volume":"159 1","pages":"0"},"PeriodicalIF":1.8000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"AIMS Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3934/math.20231522","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0
Abstract
Traditional numerical methods, such as computational fluid dynamics (CFD), demand large computational resources and memory for modeling fluid dynamic systems. Hence, deep learning (DL), and specifically convolutional neural network (CNN) autoencoders, have proven to be accurate tools for approximating the streamwise and vertical velocity and pressure fields when stationary flows are considered. The novelty of this paper lies in predicting future instants from an initial one with a CNN autoencoder architecture when an unsteady flow is considered. Two neural models are proposed: the first predicts the future instants on the basis of an initial sample, and the second approximates the initial sample itself. The inputs of the CNNs are the signed distance function (SDF) and the flow region channel (FRC), and, to represent the temporal evolution, the previous CFD sample is added. To increase the amount of training data for the second neural model, a data augmentation technique based on the similarity principle for fluid dynamics is implemented. As a result, low absolute errors are obtained in the prediction of the first samples near the shapes' surfaces. Even at the most advanced time instants, the prediction of the vortex zone is quite reliable. Speed-up ratios of 62.12 and 9000 are achieved by the predictions of the first and second neural models, respectively, compared to the computational cost required by the CFD simulations.
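The SDF and FRC input channels mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the grid size, the circular obstacle, and the channel ordering are assumptions for the example, not the paper's actual setup.

```python
import numpy as np

def circle_sdf(nx, ny, cx, cy, r):
    """Signed distance from each grid point to a circular obstacle:
    negative inside the solid shape, positive in the flow region."""
    x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    return np.hypot(x - cx, y - cy) - r

sdf = circle_sdf(64, 64, 32.0, 32.0, 8.0)
# Flow region channel: 1 where there is fluid, 0 inside the solid body
frc = (sdf > 0).astype(np.float32)
# A previous CFD sample (here a dummy zero field) is stacked as a third
# channel to represent the temporal evolution of the unsteady flow
prev_sample = np.zeros((64, 64), dtype=np.float32)
cnn_input = np.stack([sdf.astype(np.float32), frc, prev_sample])
print(cnn_input.shape)  # (3, 64, 64)
```

In this sketch the three channels together form one input tensor for the CNN autoencoder; at inference time the predicted field would replace `prev_sample` to roll the prediction forward in time.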
Journal overview
AIMS Mathematics is an international open access journal devoted to publishing peer-reviewed, high-quality, original papers in all fields of mathematics. We publish the following article types: original research articles, reviews, editorials, letters, and conference reports.