Surrogate convolutional neural network models for steady computational fluid dynamics simulations

Matthias Eichinger, Alexander Heinlein, A. Klawonn
{"title":"Surrogate convolutional neural network models for steady computational fluid dynamics simulations","authors":"Matthias Eichinger, Alexander Heinlein, A. Klawonn","doi":"10.1553/etna_vol56s235","DOIUrl":null,"url":null,"abstract":"A convolution neural network (CNN)-based approach for the construction of reduced order surrogate models for computational fluid dynamics (CFD) simulations is introduced; it is inspired by the approach of Guo, Li, and Iori [X. Guo, W. Li, and F. Iorio, Convolutional neural networks for steady flow approximation, in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’16, New York, USA, 2016, ACM, pp. 481–490]. In particular, the neural networks are trained in order to predict images of the flow field in a channel with varying obstacle based on an image of the geometry of the channel. A classical CNN with bottleneck structure and a U-Net are compared while varying the input format, the number of decoder paths, as well as the loss function used to train the networks. This approach yields very low prediction errors, in particular, when using the U-Net architecture. Furthermore, the models are also able to generalize to unseen geometries of the same type. A transfer learning approach enables the model to be trained to a new type of geometries with very low training cost. Finally, based on this transfer learning approach, a sequential learning strategy is introduced, which significantly reduces the amount of necessary training data.","PeriodicalId":282695,"journal":{"name":"ETNA - Electronic Transactions on Numerical Analysis","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ETNA - Electronic Transactions on Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1553/etna_vol56s235","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

A convolutional neural network (CNN)-based approach for the construction of reduced order surrogate models for computational fluid dynamics (CFD) simulations is introduced; it is inspired by the approach of Guo, Li, and Iorio [X. Guo, W. Li, and F. Iorio, Convolutional neural networks for steady flow approximation, in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’16, New York, USA, 2016, ACM, pp. 481–490]. In particular, the neural networks are trained to predict images of the flow field in a channel with a varying obstacle based on an image of the geometry of the channel. A classical CNN with a bottleneck structure and a U-Net are compared while varying the input format, the number of decoder paths, and the loss function used to train the networks. This approach yields very low prediction errors, in particular when using the U-Net architecture. Furthermore, the models are able to generalize to unseen geometries of the same type. A transfer learning approach enables the model to be trained on a new type of geometry at very low training cost. Finally, based on this transfer learning approach, a sequential learning strategy is introduced which significantly reduces the amount of training data required.
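To make the setup described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a U-Net-style CNN in PyTorch that maps a single-channel geometry image of the channel to a two-channel flow-field image. All layer sizes, channel counts, the image resolution, and the L1 loss are illustrative assumptions; the class name MiniUNet and the dummy data are hypothetical.

```python
# Minimal sketch, assuming a binary geometry image as input and a two-channel
# flow-field image (e.g., u and v velocity components) as output.
import torch
import torch.nn as nn

class MiniUNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=2, base=16):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(base, base, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(base, base * 2, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(base * 2, base * 2, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        # Decoder input: upsampled bottleneck features concatenated with encoder features.
        self.dec1 = nn.Sequential(nn.Conv2d(base * 2, base, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(base, out_ch, 3, padding=1))

    def forward(self, x):
        e1 = self.enc1(x)              # full-resolution features
        e2 = self.enc2(self.pool(e1))  # half-resolution bottleneck features
        d1 = self.up(e2)               # upsample back to full resolution
        return self.dec1(torch.cat([d1, e1], dim=1))  # skip connection + output layer

# Training loop sketch with random stand-in data; the paper compares different
# loss functions, so the L1 loss here is just one possible choice.
model = MiniUNet()
loss_fn = nn.L1Loss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
geometry = torch.rand(8, 1, 64, 64)  # dummy batch of geometry images
flow = torch.rand(8, 2, 64, 64)      # dummy batch of reference flow fields
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(geometry), flow)
    loss.backward()
    optimizer.step()
```

In this sketch, the skip connection (the torch.cat of encoder and decoder features) is what distinguishes the U-Net-style architecture from a plain encoder-decoder CNN with a bottleneck, which the abstract names as the second architecture under comparison.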