Extrapolating from neural network models: a cautionary tale
A. Pastore, M. Carnini
arXiv: Nuclear Theory, published 2020-12-11. DOI: 10.1088/1361-6471/abf08a (https://doi.org/10.1088/1361-6471/abf08a)
Citations: 12
Abstract
We present three different methods to estimate error bars on the predictions made by a neural network. All of them represent lower bounds on the extrapolation errors; for example, we did not include an analysis of robustness against small perturbations of the input data.
We first illustrate the methods with a simple toy model, then apply them to realistic cases related to nuclear masses. Using theoretical data simulated with either a liquid-drop model or a Skyrme energy density functional, we benchmark the extrapolation performance of the neural network in regions of the Segrè chart far from those used for training and validation. Finally, we discuss how error bars can help identify when the extrapolation becomes too uncertain and thus unreliable.
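The core idea of using prediction spread as an extrapolation diagnostic can be sketched with a toy model. The abstract does not specify the paper's three methods, so the snippet below shows one common stand-in: a bootstrap ensemble of small tanh networks (pure NumPy, hypothetical hyperparameters), whose member-to-member spread typically grows far outside the training interval, flagging where extrapolation becomes unreliable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a smooth 1-D function sampled only on [-2, 2], with small noise.
x_train = rng.uniform(-2.0, 2.0, size=(40, 1))
y_train = np.sin(x_train) + 0.05 * rng.normal(size=x_train.shape)

def train_net(x, y, hidden=16, steps=2000, lr=0.05, seed=0):
    """Train a one-hidden-layer tanh network by full-batch gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(size=(hidden, 1)); b2 = np.zeros(1)
    n = len(x)
    for _ in range(steps):
        h = np.tanh(x @ W1 + b1)          # hidden activations
        pred = h @ W2 + b2                # network output
        grad = 2.0 * (pred - y) / n       # dL/dpred for MSE loss
        gW2 = h.T @ grad
        gb2 = grad.sum(axis=0)
        gh = (grad @ W2.T) * (1.0 - h**2) # backprop through tanh
        gW1 = x.T @ gh
        gb1 = gh.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda xq: np.tanh(xq @ W1 + b1) @ W2 + b2

# Bootstrap ensemble: each member sees a resampled data set and a fresh init.
members = []
for k in range(10):
    idx = rng.integers(0, len(x_train), size=len(x_train))
    members.append(train_net(x_train[idx], y_train[idx], seed=k))

# Query one point inside the training range and one far outside it.
x_query = np.array([[0.0], [6.0]])
preds = np.stack([m(x_query) for m in members])  # shape (members, points, 1)
spread = preds.std(axis=0).ravel()               # ensemble standard deviation

print(f"ensemble std at x=0 (interpolation): {spread[0]:.3f}")
print(f"ensemble std at x=6 (extrapolation): {spread[1]:.3f}")
```

The ensemble standard deviation acts as a (lower-bound) error bar in the same spirit as the abstract: when the spread at a query point becomes large relative to the spread inside the training region, the extrapolation should be treated as unreliable.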