Confident neural network regression with Bootstrapped Deep Ensembles

IF 6.5 | CAS Region 2, Computer Science | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Laurens Sluijterman, Eric Cator, Tom Heskes
DOI: 10.1016/j.neucom.2025.131500
Journal: Neurocomputing, Volume 656, Article 131500
Publication date: 2025-09-10 (Journal Article)
Source URL: https://www.sciencedirect.com/science/article/pii/S0925231225021721
Citations: 0

Abstract

With the rise in the popularity and usage of neural networks, trustworthy uncertainty estimation is becoming increasingly essential. One of the most prominent uncertainty estimation methods is Deep Ensembles [20]. A classical parametric model has uncertainty in the parameters due to the fact that the data on which the model is built is a random sample. A modern neural network has an additional uncertainty component since the optimization of the network is random. Lakshminarayanan et al. [20] noted that Deep Ensembles do not incorporate the classical uncertainty induced by the effect of finite data. In this paper, we present a computationally cheap extension of Deep Ensembles for the regression setting, called Bootstrapped Deep Ensembles, that explicitly takes this classical effect of finite data into account using a modified version of the parametric bootstrap. We demonstrate through an experimental study that our method significantly improves upon standard Deep Ensembles. The resulting confidence intervals demonstrate superior coverage without sacrificing accuracy.
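The core idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: a least-squares linear model stands in for the neural network, and the noise level is estimated from residuals. The sketch shows the parametric-bootstrap mechanism: each ensemble member is refit on synthetic data drawn from the fitted model, so the spread across members captures the classical parameter uncertainty induced by finite data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + 1 + noise; a finite sample induces parameter uncertainty.
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.3, n)

def fit_mean_model(x, y):
    """Least-squares fit of y ~ a*x + b; stand-in for training one network."""
    A = np.stack([x, np.ones_like(x)], axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # (a, b)

# Step 1: fit an initial model and estimate the noise level from residuals.
coef_hat = fit_mean_model(x, y)
resid = y - (coef_hat[0] * x + coef_hat[1])
sigma_hat = resid.std(ddof=2)

# Step 2: parametric bootstrap -- each ensemble member is trained on a
# synthetic dataset drawn from the fitted model, so the spread of the
# members reflects the classical uncertainty due to finite data.
M = 50
members = []
for _ in range(M):
    y_boot = coef_hat[0] * x + coef_hat[1] + rng.normal(0.0, sigma_hat, n)
    members.append(fit_mean_model(x, y_boot))
members = np.array(members)  # shape (M, 2)

# Step 3: ensemble prediction with a normal-approximation confidence
# interval at a query point x0.
x0 = 0.5
preds = members[:, 0] * x0 + members[:, 1]
mean, half = preds.mean(), 1.96 * preds.std(ddof=1)
print(f"prediction at x0={x0}: {mean:.2f} +/- {half:.2f}")
```

In the paper's setting the mean model would be a neural network that also outputs a variance estimate, and each member additionally carries the randomness of its own training run; this sketch isolates only the finite-data bootstrap component.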
Journal
Neurocomputing (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Annual article count: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering neurocomputing theory, practice, and applications.