{"title":"Nuclear data assimilation, scientific basis and current status","authors":"E. Ivanov, Cyrille De Saint-Jean, V. Sobes","doi":"10.1051/EPJN/2021008","DOIUrl":null,"url":null,"abstract":"The use of Data Assimilation methodologies, known also as a data adjustment, liaises the results of theoretical and experimental studies improving an accuracy of simulation models and giving a confidence to designers and regulation bodies. From the mathematical point of view, it approaches an optimized fit to experimental data revealing unknown causes by known consequences that would be crucial for data calibration and validation. Data assimilation adds value in a ND evaluation process, adjusting nuclear data to particular application providing so-called optimized design-oriented library, calibrating nuclear data involving IEs since all theories and differential experiments provide the only relative values, and providing an evidence-based background for validation of Nuclear data libraries substantiating the UQ process. Similarly, it valorizes experimental data and the experiments, as such involving them in a scientific turnover extracting essential information inherently contained in legacy and newly set up experiments, and prioritizing dedicated basic experimental programs. Given that a number of popular algorithms, including deterministic like Generalized Linear Least Square methodology and stochastic ones like Backward and Hierarchic or Total Monte-Carlo, Hierarchic Monte-Carlo, etc., being different in terms of particular numerical formalism are, though, commonly grounded on the Bayesian theoretical basis. They demonstrated sufficient maturity, providing optimized design-oriented data libraries or evidence-based backgrounds for a science-driven validation of general-purpose libraries in a wide range of practical applications.","PeriodicalId":44454,"journal":{"name":"EPJ Nuclear Sciences & Technologies","volume":"1 1","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"EPJ Nuclear Sciences & Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/EPJN/2021008","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"NUCLEAR SCIENCE & TECHNOLOGY","Score":null,"Total":0}
Abstract
The use of data assimilation methodologies, also known as data adjustment, links the results of theoretical and experimental studies, improving the accuracy of simulation models and giving confidence to designers and regulatory bodies. From the mathematical point of view, it produces an optimized fit to experimental data, inferring unknown causes from known consequences, which is crucial for data calibration and validation. Data assimilation adds value to the nuclear data (ND) evaluation process in three ways: adjusting nuclear data to a particular application, providing a so-called optimized design-oriented library; calibrating nuclear data against integral experiments (IEs), since theories and differential experiments alone provide only relative values; and providing an evidence-based foundation for the validation of nuclear data libraries, substantiating the uncertainty quantification (UQ) process. Similarly, it valorizes experimental data and the experiments themselves, bringing them into scientific circulation by extracting the essential information inherently contained in legacy and newly designed experiments, and by prioritizing dedicated basic experimental programs. A number of popular algorithms, including deterministic ones such as the Generalized Linear Least Squares (GLLS) methodology and stochastic ones such as Backward Monte Carlo, Total Monte Carlo, and Hierarchic Monte Carlo, differ in their particular numerical formalism but share a common Bayesian theoretical basis. They have demonstrated sufficient maturity, providing optimized design-oriented data libraries or an evidence-based foundation for science-driven validation of general-purpose libraries across a wide range of practical applications.
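The GLLS adjustment mentioned in the abstract can be summarized by the standard Bayesian update equations for a linearized model. The following is a minimal numerical sketch of that update, not the authors' implementation; the function name, matrix shapes, and all toy values are illustrative assumptions introduced here for clarity.

```python
# Minimal sketch of a Generalized Linear Least Squares (GLLS) nuclear data
# adjustment, assuming a linearized model C(sigma) ~ C0 + S (sigma - sigma0).
# All names and values below are illustrative, not taken from the paper.
import numpy as np

def glls_update(sigma0, M, S, E, C0, V):
    """One GLLS / Bayesian update step.

    sigma0 : prior nuclear data parameters, shape (n,)
    M      : prior parameter covariance, shape (n, n)
    S      : sensitivity matrix dC/dsigma, shape (m, n)
    E      : measured integral responses, shape (m,)
    C0     : calculated responses at sigma0, shape (m,)
    V      : experimental covariance, shape (m, m)
    """
    # Total covariance of the discrepancy E - C0: propagated prior
    # uncertainty of the calculation plus experimental uncertainty.
    G = S @ M @ S.T + V
    # Kalman-like gain mapping response discrepancies to parameter shifts.
    K = M @ S.T @ np.linalg.inv(G)
    sigma_post = sigma0 + K @ (E - C0)   # adjusted (posterior) parameters
    M_post = M - K @ S @ M               # reduced posterior covariance
    return sigma_post, M_post

# Toy example: two parameters, one integral experiment (made-up numbers).
sigma0 = np.array([1.00, 2.00])
M = np.diag([0.04, 0.09])                # prior variances
S = np.array([[0.5, 0.3]])               # response sensitivity
E = np.array([2.05])                     # measured value
C0 = np.array([1.60])                    # calculated value
V = np.array([[0.01]])                   # experimental variance

sigma_post, M_post = glls_update(sigma0, M, S, E, C0, V)
print(sigma_post)  # parameters pulled toward the measurement
print(M_post)      # posterior variances smaller than the prior
```

In this sketch the posterior parameters move toward the measurement in proportion to the prior uncertainty and the sensitivities, and the posterior covariance shrinks, which is the mechanism behind the "optimized design-oriented library" and the UQ substantiation described above. The stochastic variants named in the abstract (Total Monte Carlo and related methods) replace the linearization with sampled evaluations but rest on the same Bayesian update.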