Conception and Software Implementation of a Nuclear Data Evaluation Pipeline

Impact Factor 2.8 · CAS Tier 1 (Physics & Astronomy) · JCR Q2 (Physics, Nuclear)
G. Schnabel , H. Sjöstrand , J. Hansson , D. Rochman , A. Koning , R. Capote
DOI: 10.1016/j.nds.2021.04.007
Nuclear Data Sheets, Volume 173 (March 2021), Pages 239-284
Citations: 16

Abstract


We discuss the design and software implementation of a nuclear data evaluation pipeline applied for a fully reproducible evaluation of neutron-induced cross sections of 56Fe above the resolved resonance region using the nuclear model code TALYS combined with relevant experimental data. The emphasis of this paper is on the mathematical and technical aspects of the pipeline and not on the evaluation of 56Fe, which is tentative. The mathematical building blocks combined and employed in the pipeline are discussed in detail. In particular, an intuitive and unified representation of experimental data, systematic and statistical errors, model parameters and defects enables the application of the Generalized Least Squares (GLS) method and its natural extension, the Levenberg-Marquardt (LM) algorithm, on a large collection of experimental data without the need for data reduction techniques as a preparatory step. The LM algorithm tailored to nuclear data evaluation takes into account the exact non-linear physics model to determine best estimates of nuclear quantities. Associated uncertainty information is derived from a second-order Taylor expansion at the maximum of the posterior distribution. We also discuss the pipeline in terms of its IT (information technology) building blocks, such as those to efficiently manage and retrieve experimental data of the EXFOR library, which facilitates their appropriate correction, and to distribute computations on a scientific cluster. Relying on the mathematical and IT building blocks, we elaborate on the sequence of steps in the pipeline to perform the evaluation, such as the retrieval of experimental data, the correction of experimental uncertainties using marginal likelihood optimization (MLO), and, after a screening of thousands of TALYS parameters (including Gaussian process priors on energy-dependent parameters), the fitting of about 150 parameters using the LM algorithm.
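The GLS/LM update and the Taylor-expansion uncertainty described in the abstract can be sketched in miniature. The toy two-parameter model below stands in for TALYS, and all names and numbers are hypothetical illustrations, not the paper's implementation; the actual pipeline handles roughly 150 parameters and far richer covariance structures.

```python
import numpy as np

# Toy nonlinear "physics model" standing in for TALYS:
# f(E; a, b) = a * exp(-b * E). Purely illustrative.
def model(p, E):
    return p[0] * np.exp(-p[1] * E)

def jacobian(p, E):
    J = np.empty((E.size, 2))
    J[:, 0] = np.exp(-p[1] * E)
    J[:, 1] = -p[0] * E * np.exp(-p[1] * E)
    return J

def lm_fit(y, E, cov, p_prior, prior_cov, n_iter=50, lam=1e-3):
    """Levenberg-Marquardt search for the posterior maximum under a
    Gaussian prior N(p_prior, prior_cov) and a Gaussian likelihood
    with experimental covariance `cov` (statistical + systematic).
    Returns the best estimate and the posterior covariance from a
    second-order Taylor expansion at the maximum."""
    cov_inv = np.linalg.inv(cov)
    prior_inv = np.linalg.inv(prior_cov)
    p = p_prior.copy()
    for _ in range(n_iter):
        J = jacobian(p, E)
        r = y - model(p, E)
        A = J.T @ cov_inv @ J + prior_inv               # GLS normal matrix
        g = J.T @ cov_inv @ r - prior_inv @ (p - p_prior)
        # Damped (LM) step; lam -> 0 recovers the pure GLS update.
        p = p + np.linalg.solve(A + lam * np.diag(np.diag(A)), g)
    # Gauss-Newton approximation of the Hessian of the negative
    # log-posterior at the maximum; its inverse is the posterior covariance.
    J = jacobian(p, E)
    post_cov = np.linalg.inv(J.T @ cov_inv @ J + prior_inv)
    return p, post_cov

# Synthetic "experiment": noiseless data from known parameters.
E = np.linspace(0.1, 5.0, 40)
p_true = np.array([2.0, 0.7])
y = model(p_true, E)
cov = 1e-4 * np.eye(E.size)             # uncorrelated statistical errors
p_hat, post_cov = lm_fit(y, E, cov,
                         p_prior=np.array([1.5, 1.0]),
                         prior_cov=np.diag([100.0, 100.0]))
```

Because the synthetic data are noiseless and the prior is weak, the fit recovers the generating parameters; the returned `post_cov` is the second-order Taylor (Laplace) uncertainty estimate mentioned in the abstract.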
The code of the pipeline, including a manual and a Dockerfile for simplified installation, is available at www.nucleardata.com.
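The marginal likelihood optimization (MLO) step for correcting experimental uncertainties can likewise be illustrated with a minimal sketch: given residuals between experiment and model, choose the size of a fully correlated (normalization-type) systematic error component so that the Gaussian marginal likelihood of the residuals is maximized. The one-dimensional grid search and all names here are a hypothetical simplification, not the paper's implementation.

```python
import numpy as np

def neg_log_marginal_likelihood(s, r, sig_stat):
    """Negative Gaussian log-marginal-likelihood (constants dropped) of
    residuals r under covariance C(s) = sig_stat^2 * I + s^2 * 1 1^T,
    i.e. independent statistical errors plus a fully correlated
    normalization error of unknown size s."""
    n = r.size
    C = sig_stat**2 * np.eye(n) + s**2 * np.ones((n, n))
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + r @ np.linalg.solve(C, r))

def mlo_scale(r, sig_stat, grid=np.linspace(0.0, 1.0, 1001)):
    """Grid search for the systematic-error size that maximizes the
    marginal likelihood (a 1-D stand-in for a full MLO)."""
    vals = [neg_log_marginal_likelihood(s, r, sig_stat) for s in grid]
    return grid[int(np.argmin(vals))]

# Residuals with a common 30% offset: MLO attributes the offset to the
# correlated normalization component, not the small statistical errors.
r = np.full(30, 0.3)
s_hat = mlo_scale(r, sig_stat=0.05)
```

The point of the exercise: because the offset is shared by all points, inflating the correlated component raises the likelihood far more than blaming the independent statistical errors, which is the mechanism by which MLO corrects reported experimental uncertainties.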

Source journal: Nuclear Data Sheets (Physics: Nuclear)
CiteScore: 7.80
Self-citation rate: 5.40%
Annual publications: 22
Review time: >12 weeks
Journal description: The Nuclear Data Sheets are current and published monthly. They are devoted to compilations and evaluations of experimental and theoretical results in nuclear physics. The journal is mostly produced from the Evaluated Nuclear Structure Data File (ENSDF), a computer file maintained by the US National Nuclear Data Center.