Conception and Software Implementation of a Nuclear Data Evaluation Pipeline
G. Schnabel, H. Sjöstrand, J. Hansson, D. Rochman, A. Koning, R. Capote
Nuclear Data Sheets, Vol. 173 (2021), pp. 239-284. DOI: 10.1016/j.nds.2021.04.007
Citations: 16
Abstract
We discuss the design and software implementation of a nuclear data evaluation pipeline applied to a fully reproducible evaluation of neutron-induced cross sections of ⁵⁶Fe above the resolved resonance region, using the nuclear model code TALYS combined with relevant experimental data. The emphasis of this paper is on the mathematical and technical aspects of the pipeline and not on the evaluation of ⁵⁶Fe, which is tentative. The mathematical building blocks combined and employed in the pipeline are discussed in detail. In particular, an intuitive and unified representation of experimental data, systematic and statistical errors, model parameters, and model defects enables the application of the Generalized Least Squares (GLS) method and its natural extension, the Levenberg-Marquardt (LM) algorithm, to a large collection of experimental data without the need for data reduction techniques as a preparatory step. The LM algorithm, tailored to nuclear data evaluation, takes into account the exact non-linear physics model to determine best estimates of nuclear quantities. Associated uncertainty information is derived from a second-order Taylor expansion at the maximum of the posterior distribution. We also discuss the pipeline in terms of its IT (information technology) building blocks, such as those used to efficiently manage and retrieve experimental data from the EXFOR library, which facilitates their appropriate correction, and to distribute computations on a scientific cluster. Relying on the mathematical and IT building blocks, we elaborate on the sequence of steps in the pipeline to perform the evaluation, such as the retrieval of experimental data, the correction of experimental uncertainties using marginal likelihood optimization (MLO), and, after a screening of thousands of TALYS parameters (including Gaussian process priors on energy-dependent parameters), the fitting of about 150 parameters using the LM algorithm. The code of the pipeline, including a manual and a Dockerfile for simplified installation, is available at www.nucleardata.com.
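The abstract does not reproduce the underlying equations. As a rough guide only, a Bayesian nonlinear GLS objective, a damped (LM) update, and the covariance obtained from a second-order Taylor expansion at the posterior maximum take the following standard textbook forms; the notation (f for the nonlinear model, J for its Jacobian, y for the experimental data, Σ for the experimental covariance, B for the prior parameter covariance, λ for the damping factor) is generic and not taken from the paper.

% Sketch of a Bayesian nonlinear GLS fit with a damped LM update.
% Generic notation; not the paper's exact formulation.
\begin{align*}
  % posterior cost: experimental misfit plus prior penalty
  \chi^2(\vec p) &= \bigl(\vec y - f(\vec p)\bigr)^{\top} \Sigma^{-1} \bigl(\vec y - f(\vec p)\bigr)
                  + (\vec p - \vec p_0)^{\top} B^{-1} (\vec p - \vec p_0) \\
  % damped LM step around the current iterate p_k, with J_k the Jacobian of f at p_k
  \Delta\vec p_k &= \bigl(J_k^{\top} \Sigma^{-1} J_k + B^{-1} + \lambda_k \mathbb{1}\bigr)^{-1}
                   \Bigl[ J_k^{\top} \Sigma^{-1} \bigl(\vec y - f(\vec p_k)\bigr) - B^{-1}(\vec p_k - \vec p_0) \Bigr] \\
  % covariance from the second-order Taylor expansion at the posterior maximum p_*
  \operatorname{Cov}(\vec p) &\approx \bigl(J_*^{\top} \Sigma^{-1} J_*^{} + B^{-1}\bigr)^{-1}
\end{align*}

For λ_k → 0 the step reduces to a plain GLS update of the linearized model, while larger λ_k shortens the step toward gradient descent, which is what makes the LM iteration robust for a strongly non-linear model such as TALYS.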
About the Journal
The Nuclear Data Sheets are current and are published monthly. They are devoted to compilations and evaluations of experimental and theoretical results in nuclear physics. The journal is mostly produced from the Evaluated Nuclear Structure Data File (ENSDF), a computer file maintained by the US National Nuclear Data Center.