A statistical modelling approach to feedforward neural network model selection

Impact factor 1.2 · CAS Zone 4 (Mathematics) · JCR Q2 (Statistics & Probability)
Andrew McInerney, Kevin Burke
{"title":"A statistical modelling approach to feedforward neural network model selection","authors":"Andrew McInerney, Kevin Burke","doi":"10.1177/1471082x241258261","DOIUrl":null,"url":null,"abstract":"Feedforward neural networks (FNNs) can be viewed as non-linear regression models, where covariates enter the model through a combination of weighted summations and non-linear functions. Although these models have some similarities to the approaches used within statistical modelling, the majority of neural network research has been conducted outside of the field of statistics. This has resulted in a lack of statistically based methodology, and, in particular, there has been little emphasis on model parsimony. Determining the input layer structure is analogous to variable selection, while the structure for the hidden layer relates to model complexity. In practice, neural network model selection is often carried out by comparing models using out-of-sample performance. However, in contrast, the construction of an associated likelihood function opens the door to information-criteria-based variable and architecture selection. A novel model selection method, which performs both input- and hidden-node selection, is proposed using the Bayesian information criterion (BIC) for FNNs. The choice of BIC over out-of-sample performance as the model selection objective function leads to an increased probability of recovering the true model, while parsimoniously achieving favourable out-of-sample performance. Simulation studies are used to evaluate and justify the proposed method, and applications on real data are investigated.","PeriodicalId":49476,"journal":{"name":"Statistical Modelling","volume":"119 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistical Modelling","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1177/1471082x241258261","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0

Abstract

Feedforward neural networks (FNNs) can be viewed as non-linear regression models, where covariates enter the model through a combination of weighted summations and non-linear functions. Although these models have some similarities to the approaches used within statistical modelling, the majority of neural network research has been conducted outside of the field of statistics. This has resulted in a lack of statistically based methodology, and, in particular, there has been little emphasis on model parsimony. Determining the input layer structure is analogous to variable selection, while the structure for the hidden layer relates to model complexity. In practice, neural network model selection is often carried out by comparing models using out-of-sample performance. However, in contrast, the construction of an associated likelihood function opens the door to information-criteria-based variable and architecture selection. A novel model selection method, which performs both input- and hidden-node selection, is proposed using the Bayesian information criterion (BIC) for FNNs. The choice of BIC over out-of-sample performance as the model selection objective function leads to an increased probability of recovering the true model, while parsimoniously achieving favourable out-of-sample performance. Simulation studies are used to evaluate and justify the proposed method, and applications on real data are investigated.
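As a rough illustration of the criterion the abstract refers to, the sketch below treats a single-hidden-layer FNN as a non-linear regression with Gaussian errors, fits it by maximum likelihood, and scores the architecture with BIC. The scikit-learn fitting routine, the logistic activation, and the parameter count used here are assumptions made for the example; they are not the authors' implementation.

```python
# Minimal sketch (assumed setup, not the paper's code): score a
# single-hidden-layer FNN with BIC under a Gaussian likelihood.
import numpy as np
from sklearn.neural_network import MLPRegressor

def fnn_bic(X, y, n_hidden):
    """Fit an FNN with `n_hidden` hidden nodes and return its BIC."""
    n, p = X.shape
    fit = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="logistic",
                       solver="lbfgs", alpha=0.0, max_iter=5000).fit(X, y)
    resid = y - fit.predict(X)
    sigma2 = np.mean(resid ** 2)                      # MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    k = n_hidden * (p + 2) + 2                        # weights, biases and sigma^2
    return k * np.log(n) - 2.0 * loglik

# Candidate architectures are then compared by BIC rather than by
# out-of-sample error, e.g. choosing the number of hidden nodes as
# best_q = min(range(1, 11), key=lambda q: fnn_bic(X, y, q)).
```

The same idea extends to input selection: dropping a covariate changes the parameter count and the likelihood, and the candidate with the smallest BIC is retained.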
Source journal
Statistical Modelling (Mathematics / Statistics & Probability)
CiteScore: 2.20
Self-citation rate: 0.00%
Articles published: 16
Review time: >12 weeks
Journal description: The primary aim of the journal is to publish original and high-quality articles that recognize statistical modelling as the general framework for the application of statistical ideas. Submissions must reflect important developments, extensions, and applications in statistical modelling. The journal also encourages submissions that describe scientifically interesting, complex or novel statistical modelling aspects from a wide diversity of disciplines, and submissions that embrace the diversity of applied statistical modelling.