Implicit factorized transformer approach to fast prediction of turbulent channel flows

IF 7.5 · CAS Tier 1, Physics & Astronomy · Q1 PHYSICS, MULTIDISCIPLINARY
Huiyu Yang, Yunpeng Wang, Jianchun Wang
{"title":"紊流通道流动的隐式分解变压器快速预测方法","authors":"Huiyu Yang,&nbsp;Yunpeng Wang,&nbsp;Jianchun Wang","doi":"10.1007/s11433-024-2666-9","DOIUrl":null,"url":null,"abstract":"<div><p>Transformer neural operators have recently become an effective approach for surrogate modeling of systems governed by partial differential equations (PDEs). In this paper, we introduce a modified implicit factorized transformer (IFactFormer-m) model, replacing the original chained factorized attention with parallel factorized attention. The IFactFormer-m model successfully performs long-term predictions for turbulent channel flow. In contrast, the original IFactFormer (IFactFormer-o), Fourier neural operator (FNO), and implicit Fourier neural operator (IFNO) exhibit a poor performance. Turbulent channel flows are simulated by direct numerical simulation using fine grids at friction Reynolds numbers <i>Re</i><sub><i>τ</i></sub> ≈ 180, 395, 590, and filtered to coarse grids for training neural operator. The neural operator takes the current flow field as input and predicts the flow field at the next time step, and long-term prediction is achieved in the posterior through an autoregressive approach. The results show that IFactFormer-m, compared with other neural operators and the traditional large eddy simulation (LES) methods, including the dynamic Smagorinsky model (DSM) and the wall-adapted local eddy-viscosity (WALE) model, reduces prediction errors in the short term, and achieves stable and accurate long-term prediction of various statistical properties and flow structures, including the energy spectrum, mean streamwise velocity, root mean square (RMS) values of fluctuating velocities, Reynolds shear stress, and spatial structures of instantaneous velocity. Moreover, the trained IFactFormer-m is much faster than traditional LES methods. By analyzing the attention kernels, we elucidate why IFactFormer-m converges faster and achieves a stable and accurate long-term prediction compared with IFactFormer-o. Code and data are available at: https://github.com/huiyu-2002/IFactFormer-m.</p></div>","PeriodicalId":774,"journal":{"name":"Science China Physics, Mechanics & Astronomy","volume":"69 1","pages":""},"PeriodicalIF":7.5000,"publicationDate":"2025-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Implicit factorized transformer approach to fast prediction of turbulent channel flows\",\"authors\":\"Huiyu Yang,&nbsp;Yunpeng Wang,&nbsp;Jianchun Wang\",\"doi\":\"10.1007/s11433-024-2666-9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Transformer neural operators have recently become an effective approach for surrogate modeling of systems governed by partial differential equations (PDEs). In this paper, we introduce a modified implicit factorized transformer (IFactFormer-m) model, replacing the original chained factorized attention with parallel factorized attention. The IFactFormer-m model successfully performs long-term predictions for turbulent channel flow. In contrast, the original IFactFormer (IFactFormer-o), Fourier neural operator (FNO), and implicit Fourier neural operator (IFNO) exhibit a poor performance. Turbulent channel flows are simulated by direct numerical simulation using fine grids at friction Reynolds numbers <i>Re</i><sub><i>τ</i></sub> ≈ 180, 395, 590, and filtered to coarse grids for training neural operator. 
The neural operator takes the current flow field as input and predicts the flow field at the next time step, and long-term prediction is achieved in the posterior through an autoregressive approach. The results show that IFactFormer-m, compared with other neural operators and the traditional large eddy simulation (LES) methods, including the dynamic Smagorinsky model (DSM) and the wall-adapted local eddy-viscosity (WALE) model, reduces prediction errors in the short term, and achieves stable and accurate long-term prediction of various statistical properties and flow structures, including the energy spectrum, mean streamwise velocity, root mean square (RMS) values of fluctuating velocities, Reynolds shear stress, and spatial structures of instantaneous velocity. Moreover, the trained IFactFormer-m is much faster than traditional LES methods. By analyzing the attention kernels, we elucidate why IFactFormer-m converges faster and achieves a stable and accurate long-term prediction compared with IFactFormer-o. Code and data are available at: https://github.com/huiyu-2002/IFactFormer-m.</p></div>\",\"PeriodicalId\":774,\"journal\":{\"name\":\"Science China Physics, Mechanics & Astronomy\",\"volume\":\"69 1\",\"pages\":\"\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Science China Physics, Mechanics & Astronomy\",\"FirstCategoryId\":\"101\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s11433-024-2666-9\",\"RegionNum\":1,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PHYSICS, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science China Physics, Mechanics & Astronomy","FirstCategoryId":"101","ListUrlMain":"https://link.springer.com/article/10.1007/s11433-024-2666-9","RegionNum":1,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PHYSICS, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Transformer neural operators have recently become an effective approach for surrogate modeling of systems governed by partial differential equations (PDEs). In this paper, we introduce a modified implicit factorized transformer (IFactFormer-m) model, replacing the original chained factorized attention with parallel factorized attention. The IFactFormer-m model successfully performs long-term predictions for turbulent channel flow. In contrast, the original IFactFormer (IFactFormer-o), Fourier neural operator (FNO), and implicit Fourier neural operator (IFNO) exhibit poor performance. Turbulent channel flows are simulated by direct numerical simulation using fine grids at friction Reynolds numbers Reτ ≈ 180, 395, and 590, and filtered to coarse grids for training the neural operator. The neural operator takes the current flow field as input and predicts the flow field at the next time step, and long-term prediction is achieved a posteriori through an autoregressive approach. The results show that IFactFormer-m, compared with other neural operators and traditional large eddy simulation (LES) methods, including the dynamic Smagorinsky model (DSM) and the wall-adapted local eddy-viscosity (WALE) model, reduces prediction errors in the short term, and achieves stable and accurate long-term prediction of various statistical properties and flow structures, including the energy spectrum, mean streamwise velocity, root mean square (RMS) values of fluctuating velocities, Reynolds shear stress, and spatial structures of instantaneous velocity. Moreover, the trained IFactFormer-m is much faster than traditional LES methods. By analyzing the attention kernels, we elucidate why IFactFormer-m converges faster and achieves a stable and accurate long-term prediction compared with IFactFormer-o. Code and data are available at: https://github.com/huiyu-2002/IFactFormer-m.
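
The key architectural change named in the abstract is replacing chained factorized attention with parallel factorized attention. The sketch below illustrates one plausible reading of that distinction on a 3D grid: in the chained variant each axial attention consumes the previous one's output, while in the parallel variant all three axial attentions read the same input and their outputs are combined by summation. This is a minimal illustration under stated assumptions, not the authors' implementation; the class name, the use of `nn.MultiheadAttention`, and the summation rule are assumptions for exposition (the actual code is in the repository linked above).

```python
# Minimal sketch (assumptions, not the authors' code) contrasting the two
# factorized-attention layouts named in the abstract. Each axial attention
# attends along a single spatial axis of the (B, Nx, Ny, Nz, C) field,
# which is what keeps factorized attention tractable on 3D grids.
import torch
import torch.nn as nn


class FactorizedAttention3D(nn.Module):
    def __init__(self, dim: int, heads: int = 4, mode: str = "parallel"):
        super().__init__()
        # one attention module per spatial axis (x, y, z)
        self.axial_attn = nn.ModuleList(
            nn.MultiheadAttention(dim, heads, batch_first=True)
            for _ in range(3)
        )
        self.mode = mode

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        if self.mode == "chained":
            # IFactFormer-o style: axes are processed sequentially, so each
            # stage sees only the previous stage's output
            for axis, attn in enumerate(self.axial_attn):
                u = self._attend_along(attn, u, axis)
            return u
        # IFactFormer-m style: every axis attends to the same input
        # independently, and the three results are summed
        return sum(
            self._attend_along(attn, u, axis)
            for axis, attn in enumerate(self.axial_attn)
        )

    @staticmethod
    def _attend_along(attn: nn.Module, u: torch.Tensor, axis: int) -> torch.Tensor:
        # flatten every dimension except the attended axis into the batch
        b, nx, ny, nz, c = u.shape
        n = u.shape[axis + 1]
        seq = u.movedim(axis + 1, -2).reshape(-1, n, c)
        out, _ = attn(seq, seq, seq)
        rest = [nx, ny, nz]
        rest.pop(axis)
        return out.reshape(b, *rest, n, c).movedim(-2, axis + 1)
```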
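
The prediction procedure itself is a plain autoregressive rollout: the trained operator maps the current coarse-grained field to the field one time step ahead, and long-term prediction repeatedly feeds each output back in as the next input. A minimal sketch, assuming a generic one-step PyTorch surrogate with hypothetical names:

```python
# Minimal rollout sketch under the same assumptions: `model` is any trained
# one-step surrogate mapping a field of shape (B, Nx, Ny, Nz, C) to the
# field at the next time step with the same shape.
import torch


@torch.no_grad()
def autoregressive_rollout(model, u0: torch.Tensor, n_steps: int) -> torch.Tensor:
    """Feed each prediction back in as the next input for n_steps."""
    u = u0
    trajectory = [u]
    for _ in range(n_steps):
        u = model(u)          # one-step prediction
        trajectory.append(u)  # errors compound across steps, which is why
                              # long-term stability is the hard part
    return torch.stack(trajectory, dim=1)  # (B, n_steps + 1, Nx, Ny, Nz, C)
```

This loop is exactly where the failure mode reported in the abstract shows up: operators that are accurate for a single step (IFactFormer-o, FNO, IFNO) can still drift once their own outputs become their inputs, whereas IFactFormer-m remains stable over long horizons.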

Source journal
Science China Physics, Mechanics & Astronomy (PHYSICS, MULTIDISCIPLINARY)
CiteScore: 10.30
Self-citation rate: 6.20%
Annual publications: 4047
Review time: 3 months
Journal introduction: Science China Physics, Mechanics & Astronomy, an academic journal cosponsored by the Chinese Academy of Sciences and the National Natural Science Foundation of China and published by Science China Press, is committed to publishing high-quality, original results in both basic and applied research. It is published in both print and electronic forms and is indexed by the Science Citation Index. Categories of articles: Reviews summarize representative results and achievements in a particular topic or area, comment on the current state of research, and advise on research directions; the author's own opinion and related discussion are requested. Research papers report important original results in all areas of physics, mechanics, and astronomy. Brief reports present short, timely reports of the latest important results.