Low Cost Recurrent and Asymptotically Unbiased Estimators of Statistical Uncertainty on Averaged Fields for DNS and LES
Margaux Boxho, Thomas Toulorge, Michel Rasquin, Grégory Dergham, Koen Hillewaert
Flow, Turbulence and Combustion 113(2), 331–361 (2024). Published 2024-05-29. DOI: 10.1007/s10494-024-00556-0
Abstract
In the context of fundamental flow studies, experimental databases are expected to provide uncertainty margins on the measured quantities. With the rapid increase in available computational power and the development of high-resolution fluid simulation techniques, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) are increasingly used in synergy with experiments to provide a complementary view. Moreover, they can access statistical moments of the flow variables for the development, calibration, and validation of turbulence models. In this context, the quantification of statistical errors is also essential for numerical studies. Reliable estimation of these errors poses two challenges. The first challenge is the very large amount of data: the simulation can provide a large number of quantities of interest (typically about 180) over the entire domain (typically 100 million to 10 billion degrees of freedom per equation). Ideally, one would like to quantify the error for each quantity at any point in the flow field. However, storing a long-term sequence of signals from many quantities over the entire domain for a posteriori evaluation is prohibitively expensive. The second challenge is the short time step required to resolve turbulent flows with DNS and LES. As a direct consequence, consecutive samples within the time series are highly correlated. To overcome both challenges, a novel, economical co-processing approach to estimate statistical errors is proposed, based on a recursive formula and the rolling storage of short-time signals.
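To illustrate the general idea of a co-processing estimator (not the authors' exact formulation, which is detailed in the paper), the sketch below updates the running mean and variance of a single probed quantity recursively, keeps only a short rolling buffer of recent samples, and uses that buffer to estimate the lag-1 autocorrelation so the standard error of the mean can be corrected for sample-to-sample correlation via an AR(1)-type effective sample size. The class name, window size, and the AR(1) correction are illustrative assumptions.

```python
# Minimal sketch of an on-the-fly statistical-uncertainty estimator for a
# time-averaged quantity, assuming an AR(1)-like correlation model.
# RollingUncertaintyEstimator and its parameters are hypothetical names.
from collections import deque
import numpy as np


class RollingUncertaintyEstimator:
    def __init__(self, window=1024):
        self.n = 0            # total number of samples seen
        self.mean = 0.0       # running mean, updated recursively
        self.m2 = 0.0         # running sum of squared deviations (Welford)
        self.buf = deque(maxlen=window)  # rolling storage of recent samples

    def update(self, x):
        """Fold one new sample into the statistics with O(window) memory."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        self.buf.append(x)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    def _lag1_autocorr(self):
        """Lag-1 autocorrelation estimated from the rolling buffer."""
        b = np.asarray(self.buf, dtype=float)
        if b.size < 3 or b.std() == 0.0:
            return 0.0
        d = b - b.mean()
        return float(np.dot(d[:-1], d[1:]) / np.dot(d, d))

    def standard_error(self):
        """Standard error of the mean, corrected for correlated samples."""
        rho = min(max(self._lag1_autocorr(), 0.0), 0.99)
        n_eff = self.n * (1.0 - rho) / (1.0 + rho)  # AR(1) effective samples
        return np.sqrt(self.variance() / max(n_eff, 1.0))


# Usage on a synthetic correlated signal standing in for a probed flow quantity
rng = np.random.default_rng(0)
est = RollingUncertaintyEstimator(window=2048)
x = 0.0
for _ in range(50_000):
    x = 0.95 * x + rng.normal()  # AR(1) process: highly correlated samples
    est.update(x)
print(f"mean = {est.mean:.3f} +/- {est.standard_error():.3f}")
```

Because the update is recursive and only a short window is retained per quantity, the memory footprint stays bounded even when the estimator is applied to every degree of freedom, which is the motivation for a co-processing approach rather than a posteriori analysis of stored time series.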
Journal Description
Flow, Turbulence and Combustion provides a global forum for the publication of original and innovative research results that contribute to the solution of fundamental and applied problems encountered in single-phase, multi-phase and reacting flows, in both idealized and real systems. The scope of coverage encompasses topics in fluid dynamics, scalar transport, multi-physics interactions and flow control. From time to time the journal publishes Special or Theme Issues featuring invited articles.
Contributions may report research that falls within the broad spectrum of analytical, computational and experimental methods. This includes research conducted in academia, industry and a variety of environmental and geophysical sectors. Turbulence, transition and associated phenomena are expected to play a significant role in the majority of studies reported, although non-turbulent flows, such as those typical of micro-devices, also fall within the scope. The emphasis is on originality, timeliness, quality and thematic fit, as exemplified by the title of the journal and the qualifications described above. Relevance to real-world problems and industrial applications is regarded as a strength.